[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
15406 1726854931.45818: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-ZzD
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
15406 1726854931.46736: Added group all to inventory
15406 1726854931.46738: Added group ungrouped to inventory
15406 1726854931.46742: Group all now contains ungrouped
15406 1726854931.46746: Examining possible inventory source: /tmp/network-Koj/inventory.yml
15406 1726854931.73310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
15406 1726854931.73424: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
15406 1726854931.73591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
15406 1726854931.73650: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
15406 1726854931.74067: Loaded config def from plugin (inventory/script)
15406 1726854931.74069: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
15406 1726854931.74320: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
15406 1726854931.74569: Loaded config def from plugin
(inventory/yaml) 15406 1726854931.74571: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 15406 1726854931.74884: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 15406 1726854931.75905: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 15406 1726854931.75909: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 15406 1726854931.75911: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 15406 1726854931.75917: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 15406 1726854931.75922: Loading data from /tmp/network-Koj/inventory.yml 15406 1726854931.75994: /tmp/network-Koj/inventory.yml was not parsable by auto 15406 1726854931.76266: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 15406 1726854931.76309: Loading data from /tmp/network-Koj/inventory.yml 15406 1726854931.76512: group all already in inventory 15406 1726854931.76519: set inventory_file for managed_node1 15406 1726854931.76523: set inventory_dir for managed_node1 15406 1726854931.76524: Added host managed_node1 to inventory 15406 1726854931.76526: Added host managed_node1 to group all 15406 1726854931.76527: set ansible_host for managed_node1 15406 1726854931.76528: set ansible_ssh_extra_args for managed_node1 15406 1726854931.76531: set inventory_file for managed_node2 15406 1726854931.76534: set inventory_dir for managed_node2 15406 1726854931.76534: Added host managed_node2 to inventory 15406 1726854931.76536: Added host managed_node2 to group all 15406 1726854931.76537: set ansible_host for managed_node2 15406 1726854931.76537: set ansible_ssh_extra_args for managed_node2 15406 
1726854931.76540: set inventory_file for managed_node3 15406 1726854931.76542: set inventory_dir for managed_node3 15406 1726854931.76543: Added host managed_node3 to inventory 15406 1726854931.76683: Added host managed_node3 to group all 15406 1726854931.76685: set ansible_host for managed_node3 15406 1726854931.76686: set ansible_ssh_extra_args for managed_node3 15406 1726854931.76690: Reconcile groups and hosts in inventory. 15406 1726854931.76695: Group ungrouped now contains managed_node1 15406 1726854931.76697: Group ungrouped now contains managed_node2 15406 1726854931.76698: Group ungrouped now contains managed_node3 15406 1726854931.76910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 15406 1726854931.77095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 15406 1726854931.77149: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 15406 1726854931.77191: Loaded config def from plugin (vars/host_group_vars) 15406 1726854931.77194: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 15406 1726854931.77205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 15406 1726854931.77213: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 15406 1726854931.77254: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 15406 1726854931.77628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854931.77733: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 15406 1726854931.77778: Loaded config def from plugin (connection/local) 15406 1726854931.77782: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 15406 1726854931.78566: Loaded config def from plugin (connection/paramiko_ssh) 15406 1726854931.78569: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 15406 1726854931.80667: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 15406 1726854931.80712: Loaded config def from plugin (connection/psrp) 15406 1726854931.80715: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 15406 1726854931.81491: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 15406 1726854931.81537: Loaded config def from plugin (connection/ssh) 15406 1726854931.81540: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 15406 1726854931.85248: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 15406 1726854931.85345: Loaded config def from plugin (connection/winrm) 15406 1726854931.85349: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 15406 1726854931.85391: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 15406 1726854931.85499: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 15406 1726854931.85572: Loaded config def from plugin (shell/cmd) 15406 1726854931.85574: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 15406 1726854931.85611: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 15406 1726854931.85683: Loaded config def from plugin (shell/powershell) 15406 1726854931.85685: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 15406 1726854931.85751: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 15406 1726854931.85945: Loaded config def from plugin (shell/sh) 15406 1726854931.85951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 15406 1726854931.85989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 15406 1726854931.86119: Loaded config def from plugin (become/runas) 15406 1726854931.86121: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 15406 1726854931.86326: Loaded config def from plugin (become/su) 15406 1726854931.86329: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 15406 1726854931.86506: Loaded config def from plugin (become/sudo) 15406 
1726854931.86508: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 15406 1726854931.86540: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml 15406 1726854931.87069: in VariableManager get_vars() 15406 1726854931.87118: done with get_vars() 15406 1726854931.87480: trying /usr/local/lib/python3.12/site-packages/ansible/modules 15406 1726854931.94014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 15406 1726854931.94245: in VariableManager get_vars() 15406 1726854931.94251: done with get_vars() 15406 1726854931.94254: variable 'playbook_dir' from source: magic vars 15406 1726854931.94255: variable 'ansible_playbook_python' from source: magic vars 15406 1726854931.94256: variable 'ansible_config_file' from source: magic vars 15406 1726854931.94256: variable 'groups' from source: magic vars 15406 1726854931.94257: variable 'omit' from source: magic vars 15406 1726854931.94258: variable 'ansible_version' from source: magic vars 15406 1726854931.94258: variable 'ansible_check_mode' from source: magic vars 15406 1726854931.94259: variable 'ansible_diff_mode' from source: magic vars 15406 1726854931.94260: variable 'ansible_forks' from source: magic vars 15406 1726854931.94260: variable 'ansible_inventory_sources' from source: magic vars 15406 1726854931.94261: variable 'ansible_skip_tags' from source: magic vars 15406 1726854931.94262: variable 'ansible_limit' from source: magic vars 15406 1726854931.94262: variable 'ansible_run_tags' from source: magic vars 15406 1726854931.94263: variable 'ansible_verbosity' from source: magic vars 15406 1726854931.94513: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml 15406 1726854931.95458: in VariableManager 
get_vars() 15406 1726854931.95595: done with get_vars() 15406 1726854931.95637: in VariableManager get_vars() 15406 1726854931.95651: done with get_vars() 15406 1726854931.95688: in VariableManager get_vars() 15406 1726854931.95816: done with get_vars() 15406 1726854931.96002: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 15406 1726854931.96390: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 15406 1726854931.96626: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 15406 1726854931.98108: in VariableManager get_vars() 15406 1726854931.98130: done with get_vars() 15406 1726854931.98997: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 15406 1726854931.99254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15406 1726854932.01041: in VariableManager get_vars() 15406 1726854932.01054: done with get_vars() 15406 1726854932.01057: variable 'playbook_dir' from source: magic vars 15406 1726854932.01058: variable 'ansible_playbook_python' from source: magic vars 15406 1726854932.01059: variable 'ansible_config_file' from source: magic vars 15406 1726854932.01059: variable 'groups' from source: magic vars 15406 1726854932.01060: variable 'omit' from source: magic vars 15406 1726854932.01061: variable 'ansible_version' from source: magic vars 15406 1726854932.01062: variable 'ansible_check_mode' from source: magic vars 15406 1726854932.01063: variable 'ansible_diff_mode' from source: magic vars 15406 1726854932.01063: variable 'ansible_forks' from source: magic vars 15406 1726854932.01064: variable 'ansible_inventory_sources' from source: magic vars 15406 1726854932.01066: variable 'ansible_skip_tags' from source: magic vars 15406 1726854932.01066: 
variable 'ansible_limit' from source: magic vars 15406 1726854932.01067: variable 'ansible_run_tags' from source: magic vars 15406 1726854932.01068: variable 'ansible_verbosity' from source: magic vars 15406 1726854932.01103: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 15406 1726854932.01217: in VariableManager get_vars() 15406 1726854932.01233: done with get_vars() 15406 1726854932.01274: in VariableManager get_vars() 15406 1726854932.01280: done with get_vars() 15406 1726854932.01282: variable 'playbook_dir' from source: magic vars 15406 1726854932.01283: variable 'ansible_playbook_python' from source: magic vars 15406 1726854932.01284: variable 'ansible_config_file' from source: magic vars 15406 1726854932.01285: variable 'groups' from source: magic vars 15406 1726854932.01286: variable 'omit' from source: magic vars 15406 1726854932.01286: variable 'ansible_version' from source: magic vars 15406 1726854932.01289: variable 'ansible_check_mode' from source: magic vars 15406 1726854932.01290: variable 'ansible_diff_mode' from source: magic vars 15406 1726854932.01290: variable 'ansible_forks' from source: magic vars 15406 1726854932.01291: variable 'ansible_inventory_sources' from source: magic vars 15406 1726854932.01292: variable 'ansible_skip_tags' from source: magic vars 15406 1726854932.01293: variable 'ansible_limit' from source: magic vars 15406 1726854932.01294: variable 'ansible_run_tags' from source: magic vars 15406 1726854932.01294: variable 'ansible_verbosity' from source: magic vars 15406 1726854932.01327: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 15406 1726854932.01406: in VariableManager get_vars() 15406 1726854932.01420: done with get_vars() 15406 1726854932.01470: in VariableManager get_vars() 15406 1726854932.01473: done with get_vars() 15406 1726854932.01475: variable 'playbook_dir' from 
source: magic vars 15406 1726854932.01476: variable 'ansible_playbook_python' from source: magic vars 15406 1726854932.01479: variable 'ansible_config_file' from source: magic vars 15406 1726854932.01480: variable 'groups' from source: magic vars 15406 1726854932.01489: variable 'omit' from source: magic vars 15406 1726854932.01491: variable 'ansible_version' from source: magic vars 15406 1726854932.01492: variable 'ansible_check_mode' from source: magic vars 15406 1726854932.01493: variable 'ansible_diff_mode' from source: magic vars 15406 1726854932.01493: variable 'ansible_forks' from source: magic vars 15406 1726854932.01498: variable 'ansible_inventory_sources' from source: magic vars 15406 1726854932.01499: variable 'ansible_skip_tags' from source: magic vars 15406 1726854932.01500: variable 'ansible_limit' from source: magic vars 15406 1726854932.01501: variable 'ansible_run_tags' from source: magic vars 15406 1726854932.01501: variable 'ansible_verbosity' from source: magic vars 15406 1726854932.01532: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 15406 1726854932.01619: in VariableManager get_vars() 15406 1726854932.01623: done with get_vars() 15406 1726854932.01625: variable 'playbook_dir' from source: magic vars 15406 1726854932.01626: variable 'ansible_playbook_python' from source: magic vars 15406 1726854932.01626: variable 'ansible_config_file' from source: magic vars 15406 1726854932.01627: variable 'groups' from source: magic vars 15406 1726854932.01628: variable 'omit' from source: magic vars 15406 1726854932.01629: variable 'ansible_version' from source: magic vars 15406 1726854932.01629: variable 'ansible_check_mode' from source: magic vars 15406 1726854932.01630: variable 'ansible_diff_mode' from source: magic vars 15406 1726854932.01631: variable 'ansible_forks' from source: magic vars 15406 1726854932.01632: variable 'ansible_inventory_sources' 
from source: magic vars 15406 1726854932.01632: variable 'ansible_skip_tags' from source: magic vars 15406 1726854932.01633: variable 'ansible_limit' from source: magic vars 15406 1726854932.01634: variable 'ansible_run_tags' from source: magic vars 15406 1726854932.01634: variable 'ansible_verbosity' from source: magic vars 15406 1726854932.01663: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 15406 1726854932.01766: in VariableManager get_vars() 15406 1726854932.01781: done with get_vars() 15406 1726854932.01837: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 15406 1726854932.01957: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 15406 1726854932.02071: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 15406 1726854932.02551: in VariableManager get_vars() 15406 1726854932.02575: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15406 1726854932.05453: in VariableManager get_vars() 15406 1726854932.05470: done with get_vars() 15406 1726854932.05624: in VariableManager get_vars() 15406 1726854932.05628: done with get_vars() 15406 1726854932.05630: variable 'playbook_dir' from source: magic vars 15406 1726854932.05631: variable 'ansible_playbook_python' from source: magic vars 15406 1726854932.05632: variable 'ansible_config_file' from source: magic vars 15406 1726854932.05633: variable 'groups' from source: magic vars 15406 1726854932.05633: variable 'omit' from source: magic vars 15406 1726854932.05634: variable 'ansible_version' from source: magic vars 15406 1726854932.05635: variable 'ansible_check_mode' from source: magic vars 15406 1726854932.05635: variable 'ansible_diff_mode' from source: magic vars 15406 1726854932.05636: variable 
'ansible_forks' from source: magic vars 15406 1726854932.05637: variable 'ansible_inventory_sources' from source: magic vars 15406 1726854932.05637: variable 'ansible_skip_tags' from source: magic vars 15406 1726854932.05638: variable 'ansible_limit' from source: magic vars 15406 1726854932.05639: variable 'ansible_run_tags' from source: magic vars 15406 1726854932.05640: variable 'ansible_verbosity' from source: magic vars 15406 1726854932.05672: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 15406 1726854932.05790: in VariableManager get_vars() 15406 1726854932.05802: done with get_vars() 15406 1726854932.05955: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 15406 1726854932.06356: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 15406 1726854932.06549: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 15406 1726854932.10753: in VariableManager get_vars() 15406 1726854932.10893: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15406 1726854932.12871: in VariableManager get_vars() 15406 1726854932.12874: done with get_vars() 15406 1726854932.12879: variable 'playbook_dir' from source: magic vars 15406 1726854932.12880: variable 'ansible_playbook_python' from source: magic vars 15406 1726854932.12881: variable 'ansible_config_file' from source: magic vars 15406 1726854932.12882: variable 'groups' from source: magic vars 15406 1726854932.12883: variable 'omit' from source: magic vars 15406 1726854932.12883: variable 'ansible_version' from source: magic vars 15406 1726854932.12884: variable 'ansible_check_mode' from source: magic vars 15406 1726854932.12885: variable 'ansible_diff_mode' from source: magic vars 15406 1726854932.12886: variable 
'ansible_forks' from source: magic vars 15406 1726854932.12886: variable 'ansible_inventory_sources' from source: magic vars 15406 1726854932.12889: variable 'ansible_skip_tags' from source: magic vars 15406 1726854932.12890: variable 'ansible_limit' from source: magic vars 15406 1726854932.12890: variable 'ansible_run_tags' from source: magic vars 15406 1726854932.12891: variable 'ansible_verbosity' from source: magic vars 15406 1726854932.13024: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 15406 1726854932.13208: in VariableManager get_vars() 15406 1726854932.13233: done with get_vars() 15406 1726854932.13315: in VariableManager get_vars() 15406 1726854932.13319: done with get_vars() 15406 1726854932.13321: variable 'playbook_dir' from source: magic vars 15406 1726854932.13322: variable 'ansible_playbook_python' from source: magic vars 15406 1726854932.13323: variable 'ansible_config_file' from source: magic vars 15406 1726854932.13324: variable 'groups' from source: magic vars 15406 1726854932.13325: variable 'omit' from source: magic vars 15406 1726854932.13325: variable 'ansible_version' from source: magic vars 15406 1726854932.13326: variable 'ansible_check_mode' from source: magic vars 15406 1726854932.13327: variable 'ansible_diff_mode' from source: magic vars 15406 1726854932.13327: variable 'ansible_forks' from source: magic vars 15406 1726854932.13328: variable 'ansible_inventory_sources' from source: magic vars 15406 1726854932.13329: variable 'ansible_skip_tags' from source: magic vars 15406 1726854932.13330: variable 'ansible_limit' from source: magic vars 15406 1726854932.13330: variable 'ansible_run_tags' from source: magic vars 15406 1726854932.13331: variable 'ansible_verbosity' from source: magic vars 15406 1726854932.13360: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 15406 1726854932.13544: 
in VariableManager get_vars() 15406 1726854932.13557: done with get_vars() 15406 1726854932.13750: in VariableManager get_vars() 15406 1726854932.13762: done with get_vars() 15406 1726854932.14032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 15406 1726854932.14043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 15406 1726854932.14972: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 15406 1726854932.15459: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 15406 1726854932.15468: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-ZzD/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 15406 1726854932.15622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 15406 1726854932.15648: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 15406 1726854932.15906: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 15406 1726854932.16075: Loaded config def from plugin (callback/default) 15406 1726854932.16078: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 15406 1726854932.17454: Loaded config def from plugin (callback/junit) 15406 1726854932.17457: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 15406 1726854932.17507: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 15406 1726854932.17577: Loaded config def from plugin (callback/minimal) 15406 1726854932.17580: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 15406 1726854932.17620: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 15406 1726854932.17676: Loaded config def from plugin (callback/tree) 15406 1726854932.17679: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) 
ansible.builtin.profile_tasks to ansible.posix.profile_tasks
15406 1726854932.17803: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
15406 1726854932.17806: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-ZzD/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bridge_nm.yml **************************************************
11 plays in /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
15406 1726854932.17832: in VariableManager get_vars()
15406 1726854932.17846: done with get_vars()
15406 1726854932.17852: in VariableManager get_vars()
15406 1726854932.17860: done with get_vars()
15406 1726854932.17863: variable 'omit' from source: magic vars
15406 1726854932.17908: in VariableManager get_vars()
15406 1726854932.17923: done with get_vars()
15406 1726854932.17944: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bridge.yml' with nm as provider] ***********
15406 1726854932.18492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
15406 1726854932.18569: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
15406 1726854932.18601: getting the remaining hosts for this loop
15406 1726854932.18603: done getting the remaining hosts for this loop
15406 1726854932.18605: getting the next task for host managed_node2
15406 1726854932.18609: done getting next task for host managed_node2
15406 1726854932.18610: ^ task is: TASK: Gathering Facts
15406 1726854932.18612: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854932.18614: getting variables
15406 1726854932.18615: in VariableManager get_vars()
15406 1726854932.18624: Calling all_inventory to load vars for managed_node2
15406 1726854932.18626: Calling groups_inventory to load vars for managed_node2
15406 1726854932.18629: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854932.18640: Calling all_plugins_play to load vars for managed_node2
15406 1726854932.18651: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854932.18654: Calling groups_plugins_play to load vars for managed_node2
15406 1726854932.18694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854932.18747: done with get_vars()
15406 1726854932.18754: done getting variables
15406 1726854932.18821: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
Friday 20 September 2024  13:55:32 -0400 (0:00:00.011)       0:00:00.011 ******
15406 1726854932.18841: entering _queue_task() for managed_node2/gather_facts
15406 1726854932.18842: Creating lock for gather_facts
15406 1726854932.19398: worker is 1 (out of 1
available) 15406 1726854932.19406: exiting _queue_task() for managed_node2/gather_facts 15406 1726854932.19417: done queuing things up, now waiting for results queue to drain 15406 1726854932.19418: waiting for pending results... 15406 1726854932.19523: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15406 1726854932.19554: in run() - task 0affcc66-ac2b-3c83-32d3-00000000007e 15406 1726854932.19572: variable 'ansible_search_path' from source: unknown 15406 1726854932.19616: calling self._execute() 15406 1726854932.19678: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854932.19691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854932.19704: variable 'omit' from source: magic vars 15406 1726854932.19814: variable 'omit' from source: magic vars 15406 1726854932.19853: variable 'omit' from source: magic vars 15406 1726854932.19899: variable 'omit' from source: magic vars 15406 1726854932.19972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854932.19991: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854932.20014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854932.20034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854932.20081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854932.20091: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854932.20099: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854932.20105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854932.20211: Set connection var 
ansible_module_compression to ZIP_DEFLATED 15406 1726854932.20222: Set connection var ansible_timeout to 10 15406 1726854932.20269: Set connection var ansible_connection to ssh 15406 1726854932.20273: Set connection var ansible_shell_type to sh 15406 1726854932.20275: Set connection var ansible_shell_executable to /bin/sh 15406 1726854932.20278: Set connection var ansible_pipelining to False 15406 1726854932.20319: variable 'ansible_shell_executable' from source: unknown 15406 1726854932.20328: variable 'ansible_connection' from source: unknown 15406 1726854932.20335: variable 'ansible_module_compression' from source: unknown 15406 1726854932.20341: variable 'ansible_shell_type' from source: unknown 15406 1726854932.20347: variable 'ansible_shell_executable' from source: unknown 15406 1726854932.20353: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854932.20360: variable 'ansible_pipelining' from source: unknown 15406 1726854932.20377: variable 'ansible_timeout' from source: unknown 15406 1726854932.20380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854932.20595: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854932.20599: variable 'omit' from source: magic vars 15406 1726854932.20601: starting attempt loop 15406 1726854932.20603: running the handler 15406 1726854932.20605: variable 'ansible_facts' from source: unknown 15406 1726854932.20626: _low_level_execute_command(): starting 15406 1726854932.20637: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854932.21395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854932.21476: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854932.21509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854932.21529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854932.21551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854932.22007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854932.23637: stdout chunk (state=3): >>>/root <<< 15406 1726854932.23896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854932.23899: stdout chunk (state=3): >>><<< 15406 1726854932.23901: stderr chunk (state=3): >>><<< 15406 1726854932.23904: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854932.23912: _low_level_execute_command(): starting 15406 1726854932.23922: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854932.2387927-15443-101208329219445 `" && echo ansible-tmp-1726854932.2387927-15443-101208329219445="` echo /root/.ansible/tmp/ansible-tmp-1726854932.2387927-15443-101208329219445 `" ) && sleep 0' 15406 1726854932.25779: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854932.25783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854932.25786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854932.25790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854932.25999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854932.26066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854932.26286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854932.28334: stdout chunk (state=3): >>>ansible-tmp-1726854932.2387927-15443-101208329219445=/root/.ansible/tmp/ansible-tmp-1726854932.2387927-15443-101208329219445 <<< 15406 1726854932.28338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854932.28455: stderr chunk (state=3): >>><<< 15406 1726854932.28459: stdout chunk (state=3): >>><<< 15406 1726854932.28630: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854932.2387927-15443-101208329219445=/root/.ansible/tmp/ansible-tmp-1726854932.2387927-15443-101208329219445 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854932.28645: variable 'ansible_module_compression' from source: unknown 15406 1726854932.28707: ANSIBALLZ: Using generic lock for ansible.legacy.setup 15406 1726854932.28862: ANSIBALLZ: Acquiring lock 15406 1726854932.28872: ANSIBALLZ: Lock acquired: 140626835985552 15406 1726854932.28880: ANSIBALLZ: Creating module 15406 1726854932.74779: ANSIBALLZ: Writing module into payload 15406 1726854932.74939: ANSIBALLZ: Writing module 15406 1726854932.74960: ANSIBALLZ: Renaming module 15406 1726854932.74965: ANSIBALLZ: Done creating module 15406 1726854932.75015: variable 'ansible_facts' from source: unknown 15406 1726854932.75022: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854932.75033: _low_level_execute_command(): starting 15406 1726854932.75039: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 15406 1726854932.75666: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854932.75674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854932.75686: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854932.75701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854932.75712: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854932.75719: stderr chunk (state=3): >>>debug2: match not found <<< 15406 1726854932.75727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854932.75743: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15406 1726854932.75821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854932.75835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854932.75993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15406 1726854932.78545: stdout chunk (state=3): >>>PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 15406 1726854932.78593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854932.78600: stdout chunk (state=3): >>><<< 15406 1726854932.78611: stderr chunk (state=3): >>><<< 15406 1726854932.78624: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15406 1726854932.78631 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 15406 1726854932.78717: _low_level_execute_command(): starting 15406 1726854932.78721: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 15406 1726854932.79192: Sending initial data 15406 1726854932.79198: Sent initial data (1181 bytes) 15406 1726854932.80154: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854932.80158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854932.80160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854932.80162: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854932.80165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 15406 1726854932.80172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854932.80238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854932.80245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854932.80283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854932.80369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 15406 1726854932.84798: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 15406 1726854932.84945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 
1726854932.85018: stderr chunk (state=3): >>><<< 15406 1726854932.85023: stdout chunk (state=3): >>><<< 15406 1726854932.85045: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 15406 1726854932.85147: variable 'ansible_facts' from source: unknown 
15406 1726854932.85150: variable 'ansible_facts' from source: unknown 15406 1726854932.85159: variable 'ansible_module_compression' from source: unknown 15406 1726854932.85213: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15406 1726854932.85248: variable 'ansible_facts' from source: unknown 15406 1726854932.85467: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854932.2387927-15443-101208329219445/AnsiballZ_setup.py 15406 1726854932.85716: Sending initial data 15406 1726854932.85719: Sent initial data (154 bytes) 15406 1726854932.86243: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854932.86256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854932.86273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854932.86385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854932.86403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854932.86418: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15406 1726854932.86536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 15406 1726854932.88369: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854932.88432: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854932.88542: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpl4zl1cp9 /root/.ansible/tmp/ansible-tmp-1726854932.2387927-15443-101208329219445/AnsiballZ_setup.py <<< 15406 1726854932.88552: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854932.2387927-15443-101208329219445/AnsiballZ_setup.py" <<< 15406 1726854932.88605: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpl4zl1cp9" to remote "/root/.ansible/tmp/ansible-tmp-1726854932.2387927-15443-101208329219445/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854932.2387927-15443-101208329219445/AnsiballZ_setup.py" <<< 15406 1726854932.90798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854932.90890: stderr chunk (state=3): >>><<< 15406 1726854932.90897: stdout chunk (state=3): >>><<< 15406 1726854932.90921: done transferring module to remote 15406 1726854932.90937: _low_level_execute_command(): starting 15406 1726854932.90940: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854932.2387927-15443-101208329219445/ /root/.ansible/tmp/ansible-tmp-1726854932.2387927-15443-101208329219445/AnsiballZ_setup.py && sleep 0' 15406 1726854932.91548: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854932.91573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854932.91576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854932.91584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854932.91645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 
1726854932.91648: stderr chunk (state=3): >>>debug2: match not found <<< 15406 1726854932.91650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854932.91652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15406 1726854932.91707: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854932.91756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854932.91827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 15406 1726854932.94686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854932.94693: stdout chunk (state=3): >>><<< 15406 1726854932.94696: stderr chunk (state=3): >>><<< 15406 1726854932.94698: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 15406 1726854932.94701: _low_level_execute_command(): starting 15406 1726854932.94703: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854932.2387927-15443-101208329219445/AnsiballZ_setup.py && sleep 0' 15406 1726854932.95623: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854932.95891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 
1726854932.95920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854932.96033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 15406 1726854932.99307: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 15406 1726854932.99361: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 15406 1726854932.99425: stdout chunk (state=3): >>>import 'posix' # <<< 15406 1726854932.99480: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 15406 1726854932.99534: stdout chunk (state=3): >>># installing zipimport hook import 'time' # import 'zipimport' # <<< 15406 1726854932.99610: stdout chunk (state=3): >>> # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 15406 1726854932.99693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854932.99714: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 15406 1726854932.99753: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py<<< 15406 1726854932.99803: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 15406 1726854932.99825: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956a184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9569e7b30> <<< 15406 1726854932.99869: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 15406 1726854932.99905: stdout chunk (state=3): >>> import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956a1aa50> <<< 15406 1726854932.99993: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 15406 1726854932.99996: stdout chunk (state=3): >>> import 'abc' # <<< 15406 1726854933.00019: stdout chunk (state=3): >>> import 'io' # <<< 15406 1726854933.00055: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 15406 1726854933.00393: stdout chunk (state=3): >>> import '_collections_abc' # <<< 15406 1726854933.00398: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 15406 1726854933.00400: stdout chunk (state=3): >>> <<< 15406 1726854933.00403: stdout chunk (state=3): >>>import 'os' # <<< 15406 1726854933.00405: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 15406 1726854933.00493: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth'<<< 15406 1726854933.00502: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 15406 1726854933.00512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956809130><<< 15406 1726854933.00631: stdout chunk (state=3): >>> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 15406 
1726854933.00993: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956809fa0> <<< 15406 1726854933.00997: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15406 1726854933.01384: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 15406 1726854933.01471: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15406 1726854933.01522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc'<<< 15406 1726854933.01536: stdout chunk (state=3): >>> <<< 15406 1726854933.01585: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc'<<< 15406 1726854933.01608: stdout chunk (state=3): >>> <<< 15406 1726854933.01636: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956847dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py<<< 15406 1726854933.01685: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 15406 1726854933.01770: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956847fe0> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15406 1726854933.01842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15406 1726854933.01913: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854933.01947: stdout chunk (state=3): >>>import 'itertools' # <<< 15406 1726854933.01983: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 15406 1726854933.02124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95687f800> <<< 15406 1726854933.02183: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95687fe90> import '_collections' # <<< 15406 1726854933.02198: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95685faa0> import '_functools' # <<< 15406 1726854933.02448: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95685d1c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956844f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc'<<< 15406 1726854933.02470: stdout chunk (state=3): >>> <<< 15406 
1726854933.02506: stdout chunk (state=3): >>>import '_sre' # <<< 15406 1726854933.02713: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 15406 1726854933.02742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95689f6e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95689e300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 15406 1726854933.02754: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95685e060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956846e70><<< 15406 1726854933.02894: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 15406 1726854933.02899: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568d47a0> <<< 15406 1726854933.02901: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956844200> <<< 15406 1726854933.02975: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from 
'/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854933.03170: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9568d4c50> <<< 15406 1726854933.03199: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568d4b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9568d4ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956842d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc'<<< 15406 1726854933.03343: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568d55b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568d5280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568d64b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py <<< 15406 1726854933.03440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 15406 1726854933.03643: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568ec680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9568edd30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 15406 1726854933.03663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 15406 1726854933.03681: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568eebd0> <<< 15406 1726854933.03745: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854933.03772: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9568ef230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568ee120><<< 15406 1726854933.03814: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 15406 1726854933.03874: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15406 1726854933.04137: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9568efcb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568ef3e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568d6450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py<<< 15406 1726854933.04282: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9565ebbc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc'<<< 15406 1726854933.04410: stdout chunk (state=3): >>> # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa956614710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956614470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 15406 1726854933.04413: stdout chunk (state=3): >>> import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9566146b0><<< 15406 1726854933.04453: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 15406 1726854933.04476: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15406 1726854933.04729: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 15406 1726854933.04858: stdout chunk (state=3): >>> # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854933.04861: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa956614fe0> <<< 15406 1726854933.05108: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa956615910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956614890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9565e9d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py 
<<< 15406 1726854933.05141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15406 1726854933.05183: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 15406 1726854933.05211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 15406 1726854933.05232: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956616cc0><<< 15406 1726854933.05393: stdout chunk (state=3): >>> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956615790> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568d6ba0> <<< 15406 1726854933.05549: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc'<<< 15406 1726854933.05563: stdout chunk (state=3): >>> <<< 15406 1726854933.05610: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956643020> <<< 15406 1726854933.05795: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 15406 1726854933.05799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 15406 1726854933.05801: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc'<<< 15406 1726854933.05812: stdout chunk (state=3): >>> <<< 15406 1726854933.06301: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9566633e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9566c4200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15406 1726854933.06600: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9566c6960> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9566c4320> <<< 15406 1726854933.06622: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9566911f0> <<< 15406 1726854933.07162: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955f252e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9566621e0> import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956617bf0> <<< 15406 1726854933.07165: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc'<<< 15406 1726854933.07168: stdout chunk (state=3): >>> <<< 15406 1726854933.07170: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa956662300><<< 15406 1726854933.07172: stdout chunk (state=3): >>> <<< 15406 1726854933.07442: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_jzq9l6nu/ansible_ansible.legacy.setup_payload.zip'<<< 15406 1726854933.07463: stdout chunk (state=3): >>> <<< 15406 1726854933.07477: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.07692: stdout chunk (state=3): >>># zipimport: zlib available<<< 15406 1726854933.07704: stdout chunk (state=3): >>> <<< 15406 1726854933.07725: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py<<< 15406 1726854933.07832: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py<<< 15406 1726854933.07936: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15406 1726854933.07995: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 15406 1726854933.08005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc'<<< 15406 1726854933.08216: stdout chunk (state=3): >>> import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955f8af90> <<< 15406 1726854933.08219: stdout chunk (state=3): >>>import '_typing' # <<< 15406 
1726854933.08342: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955f69e80><<< 15406 1726854933.08371: stdout chunk (state=3): >>> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955f69040> <<< 15406 1726854933.08396: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.08440: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 15406 1726854933.08481: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 15406 1726854933.08514: stdout chunk (state=3): >>> import 'ansible.module_utils' # <<< 15406 1726854933.08808: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.10783: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.12670: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 15406 1726854933.12701: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955f88e60> <<< 15406 1726854933.12738: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py<<< 15406 1726854933.12877: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc'<<< 15406 1726854933.12905: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 15406 
1726854933.12924: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so'<<< 15406 1726854933.13000: stdout chunk (state=3): >>> import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955fbe930> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955fbe720> <<< 15406 1726854933.13055: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955fbe030> <<< 15406 1726854933.13091: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 15406 1726854933.13117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc'<<< 15406 1726854933.13180: stdout chunk (state=3): >>> <<< 15406 1726854933.13222: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955fbea50><<< 15406 1726854933.13225: stdout chunk (state=3): >>> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955f8bc20> import 'atexit' # <<< 15406 1726854933.13604: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955fbf680><<< 15406 1726854933.13611: stdout chunk (state=3): >>> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955fbf8c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955fbfe00> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15406 1726854933.13701: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e25b80> <<< 15406 1726854933.13706: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 15406 1726854933.13712: stdout chunk (state=3): >>> # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 15406 1726854933.13802: stdout chunk (state=3): >>> import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e277a0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 15406 1726854933.13822: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e281a0> <<< 15406 1726854933.13902: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15406 1726854933.13909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e29340> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches 
/usr/lib64/python3.12/subprocess.py <<< 15406 1726854933.14001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15406 1726854933.14053: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e2bdd0> <<< 15406 1726854933.14200: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e28110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e2a0f0> <<< 15406 1726854933.14494: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15406 1726854933.14516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e33c80> import '_tokenize' # import 'tokenize' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa955e32750> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e324b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15406 1726854933.14522: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e32a20> <<< 15406 1726854933.14554: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e2a5a0> <<< 15406 1726854933.14604: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e77f20> <<< 15406 1726854933.14698: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e780b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 15406 1726854933.14716: stdout chunk (state=3): >>># extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e79b80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e79940> <<< 15406 1726854933.14722: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15406 1726854933.14761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15406 1726854933.14808: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e7c110> <<< 15406 1726854933.14818: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e7a270> <<< 15406 1726854933.14839: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15406 1726854933.14898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854933.14913: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 15406 1726854933.15069: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e7f860> <<< 15406 1726854933.15076: stdout chunk (state=3): >>>import 'logging' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa955e7c230> <<< 15406 1726854933.15165: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e80890> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e80a40> <<< 15406 1726854933.15506: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e80920> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e78290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955d0c0e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955d0d430> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e82870> <<< 15406 1726854933.15559: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e83c20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e824e0> <<< 15406 1726854933.15562: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.15693: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 15406 1726854933.15712: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.15763: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.15773: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.15776: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 15406 1726854933.15813: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 15406 1726854933.16050: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.16057: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 15406 1726854933.16063: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.16804: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.17425: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 15406 1726854933.17480: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15406 1726854933.17573: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955d11610> <<< 15406 1726854933.17684: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d12360> <<< 15406 1726854933.17691: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d0d5e0> <<< 15406 1726854933.17749: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 15406 1726854933.17792: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 15406 1726854933.17840: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.18019: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.18247: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 15406 1726854933.18263: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 15406 1726854933.18300: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d12ab0> # zipimport: zlib available <<< 15406 1726854933.19208: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.19501: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.19575: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.19660: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 15406 1726854933.19676: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.19713: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.19874: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854933.19916: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15406 1726854933.20305: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 15406 1726854933.20309: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.20504: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15406 1726854933.20703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d136b0> # zipimport: zlib available <<< 15406 
1726854933.20745: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.20786: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 15406 1726854933.20796: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 15406 1726854933.20804: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 15406 1726854933.20903: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.20966: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854933.21000: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.21403: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955d1e090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d19040> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 15406 1726854933.21406: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.21429: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.21512: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.21516: stdout 
chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854933.21518: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15406 1726854933.21550: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15406 1726854933.21606: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15406 1726854933.21628: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 15406 1726854933.21636: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15406 1726854933.21657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15406 1726854933.21899: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e06bd0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955efe8d0> <<< 15406 1726854933.21932: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d1e3c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d1e180> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15406 1726854933.21957: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 15406 
1726854933.21964: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.21989: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 15406 1726854933.21998: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.22049: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.22113: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.22145: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.22364: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 15406 1726854933.22710: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # <<< 15406 1726854933.22714: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.22927: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.23320: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854933.23327: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 15406 1726854933.23349: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 15406 1726854933.23370: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 15406 
1726854933.23418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 15406 1726854933.23494: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955db2300> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 15406 1726854933.23571: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 15406 1726854933.23592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 15406 1726854933.23598: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a1c260> <<< 15406 1726854933.23627: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854933.23646: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955a1c5c0> <<< 15406 1726854933.23805: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d98590> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955db2ea0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955db09e0> import 'multiprocessing' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa955db0620> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 15406 1726854933.23877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 15406 1726854933.23913: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 15406 1726854933.23937: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 15406 1726854933.23978: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955a1f5f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a1eea0> <<< 15406 1726854933.24057: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955a1f080> <<< 15406 1726854933.24060: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a1e300> <<< 15406 1726854933.24063: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15406 
1726854933.24211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a1f6e0> <<< 15406 1726854933.24270: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 15406 1726854933.24304: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955a821e0> <<< 15406 1726854933.24370: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a80200> <<< 15406 1726854933.24376: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955db0530> import 'ansible.module_utils.facts.timeout' # <<< 15406 1726854933.24415: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 15406 1726854933.24454: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 15406 1726854933.24521: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854933.24673: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 15406 1726854933.24679: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.24749: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.other.ohai' # <<< 15406 1726854933.24762: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.24793: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 15406 1726854933.24888: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.24910: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 15406 1726854933.24936: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.24964: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.25028: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 15406 1726854933.25032: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.25107: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.25170: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 15406 1726854933.25331: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854933.25695: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 15406 1726854933.25951: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.26476: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 15406 1726854933.26484: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.26512: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854933.26526: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.26600: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # 
zipimport: zlib available # zipimport: zlib available <<< 15406 1726854933.26634: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 15406 1726854933.26640: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.26795: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.26825: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # <<< 15406 1726854933.26903: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.26959: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 15406 1726854933.26981: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.27072: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 15406 1726854933.27105: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a83710> <<< 15406 1726854933.27144: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 15406 1726854933.27152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15406 1726854933.27265: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a82cf0> <<< 15406 1726854933.27269: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 15406 1726854933.27411: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # <<< 15406 1726854933.27496: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 15406 1726854933.27555: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.27594: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 15406 1726854933.27607: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.27723: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.27731: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 15406 1726854933.27900: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15406 1726854933.28000: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854933.28183: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955abe2a0> <<< 15406 1726854933.28195: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a83a40> import 'ansible.module_utils.facts.system.python' # <<< 15406 1726854933.28203: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.28257: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.28350: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 15406 1726854933.28402: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.28485: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.28598: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.28792: stdout chunk 
(state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 15406 1726854933.28801: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.28830: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 15406 1726854933.28851: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.28897: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.29006: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955ad1e50> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955ad36b0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854933.29023: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 15406 1726854933.29030: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.29419: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854933.29423: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 15406 1726854933.29525: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.29547: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.29662: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 15406 1726854933.29668: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.29770: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854933.29777: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.29972: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.30053: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 15406 1726854933.30096: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.30401: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.30597: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854933.30974: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.31433: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 15406 1726854933.31470: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.31553: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.31694: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 15406 1726854933.31763: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.31942: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 15406 1726854933.32023: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.32181: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 15406 1726854933.32190: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15406 1726854933.32259: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 15406 1726854933.32302: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.32309: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # <<< 15406 1726854933.32319: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.32412: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.32558: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.32729: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.33025: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 15406 1726854933.33329: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854933.33443: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 15406 1726854933.33508: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # <<< 15406 1726854933.33526: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.33910: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.34028: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 15406 1726854933.34049: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 15406 1726854933.34103: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.34161: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 15406 1726854933.34173: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.34206: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.34248: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 15406 1726854933.34259: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.34285: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.34398: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # <<< 15406 1726854933.34538: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.34571: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.34600: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 15406 1726854933.34613: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.34649: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.34751: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854933.34784: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.34874: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.34903: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.34994: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 
'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 15406 1726854933.35186: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.35275: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 15406 1726854933.35308: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.35497: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 15406 1726854933.35544: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.35594: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 15406 1726854933.35645: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854933.35711: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 15406 1726854933.35785: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.35874: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 15406 1726854933.35893: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.35969: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.36081: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 15406 1726854933.36145: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854933.36742: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 15406 1726854933.36806: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 15406 1726854933.36831: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa95586a9f0><<< 15406 1726854933.36952: stdout chunk (state=3): >>> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955869130> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955fbe7e0> <<< 15406 1726854933.52318: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 15406 1726854933.52356: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9558b0e30> <<< 15406 1726854933.52383: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 15406 1726854933.52438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9558b1be0> <<< 15406 1726854933.52601: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854933.52604: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955ac03e0> <<< 15406 1726854933.52615: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9558b3da0> <<< 15406 1726854933.52783: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 15406 1726854933.73663: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", 
"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": 
[{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "<<< 15406 1726854933.73703: stdout chunk (state=3): >>>tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off 
[fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macadd<<< 15406 1726854933.73713: stdout chunk (state=3): >>>ress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "55", "second": "33", "epoch": "1726854933", "epoch_int": "1726854933", "date": "2024-09-20", "time": 
"13:55:33", "iso8601_micro": "2024-09-20T17:55:33.407214Z", "iso8601": "2024-09-20T17:55:33Z", "iso8601_basic": "20240920T135533407214", "iso8601_basic_short": "20240920T135533", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2961, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 570, "free": 2961}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, 
"xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 716, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797011456, "block_size": 4096, "block_total": 65519099, "block_available": 63915286, "block_used": 1603813, "inode_total": 131070960, "inode_available": 131029049, "inode_used": 41911, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/p<<< 15406 1726854933.73739: stdout chunk (state=3): >>>ts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": 
"/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.544921875, "5m": 0.3798828125, "15m": 0.18359375}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15406 1726854933.74402: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 15406 1726854933.74421: stdout chunk (state=3): >>> <<< 15406 1726854933.74440: stdout chunk (state=3): >>># clear sys.path_hooks <<< 15406 1726854933.74443: stdout chunk (state=3): >>># clear builtins._<<< 15406 1726854933.74454: stdout chunk (state=3): >>> <<< 15406 1726854933.74469: stdout chunk (state=3): >>># clear sys.path<<< 15406 1726854933.74475: stdout chunk (state=3): >>> # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear 
sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__<<< 15406 1726854933.74497: stdout chunk (state=3): >>> # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread<<< 15406 1726854933.74517: stdout chunk (state=3): >>> # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings<<< 15406 1726854933.74541: stdout chunk (state=3): >>> # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path<<< 15406 1726854933.74573: stdout chunk (state=3): >>> # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib<<< 15406 1726854933.74593: stdout chunk (state=3): >>> # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing 
re._constants<<< 15406 1726854933.74614: stdout chunk (state=3): >>> # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external<<< 15406 1726854933.74631: stdout chunk (state=3): >>> # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2<<< 15406 1726854933.74658: stdout chunk (state=3): >>> # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset<<< 15406 1726854933.74685: stdout chunk (state=3): >>> # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path<<< 15406 1726854933.74709: stdout chunk (state=3): >>> # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing 
# cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils<<< 15406 1726854933.74731: stdout chunk (state=3): >>> # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl <<< 15406 1726854933.74757: stdout chunk (state=3): >>># cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap<<< 15406 1726854933.74784: stdout chunk (state=3): >>> # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 <<< 15406 1726854933.74809: stdout chunk (state=3): >>># cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # 
destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six<<< 15406 1726854933.74839: stdout chunk (state=3): >>> # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy<<< 15406 1726854933.74862: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast<<< 15406 1726854933.75052: stdout chunk (state=3): >>> # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing 
ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing 
ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup<<< 15406 1726854933.75083: stdout chunk (state=3): >>>[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # 
cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.fa<<< 15406 1726854933.75085: stdout chunk (state=3): >>>cts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy 
ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing 
stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 15406 1726854933.75468: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15406 1726854933.75494: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 15406 1726854933.75513: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 15406 1726854933.75541: stdout chunk (state=3): >>># destroy _blake2 <<< 15406 1726854933.75552: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma <<< 15406 1726854933.75567: stdout chunk (state=3): >>># destroy zipfile._path <<< 15406 1726854933.75577: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob <<< 15406 1726854933.75590: stdout chunk (state=3): >>># destroy ipaddress <<< 15406 1726854933.75618: stdout chunk (state=3): >>># destroy ntpath <<< 15406 1726854933.75651: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 15406 1726854933.75675: stdout chunk (state=3): >>># destroy json.scanner # destroy _json <<< 15406 1726854933.75684: stdout chunk (state=3): >>># destroy grp # destroy encodings <<< 15406 1726854933.75714: stdout chunk (state=3): >>># destroy _locale <<< 15406 1726854933.75720: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 15406 1726854933.75943: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy 
ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 15406 1726854933.75975: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd <<< 15406 1726854933.75985: stdout chunk (state=3): >>># destroy termios <<< 15406 1726854933.75998: stdout chunk (state=3): >>># destroy json <<< 15406 1726854933.76047: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 15406 1726854933.76061: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection <<< 15406 1726854933.76074: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context<<< 15406 1726854933.76088: stdout chunk (state=3): >>> # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 15406 1726854933.76101: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection<<< 15406 1726854933.76105: stdout chunk (state=3): >>> <<< 15406 1726854933.76144: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna <<< 15406 1726854933.76164: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 15406 1726854933.76178: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon <<< 15406 1726854933.76191: stdout chunk (state=3): >>># cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 <<< 15406 1726854933.76204: stdout chunk (state=3): >>># cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 15406 1726854933.76216: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 15406 1726854933.76226: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 15406 1726854933.76240: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib <<< 15406 1726854933.76254: stdout chunk (state=3): >>># cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 15406 1726854933.76438: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # 
cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 15406 1726854933.76444: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15406 1726854933.76533: stdout chunk (state=3): >>># destroy sys.monitoring <<< 15406 1726854933.76539: stdout chunk (state=3): >>># destroy _socket <<< 15406 1726854933.76559: stdout chunk (state=3): >>># destroy _collections <<< 15406 1726854933.76606: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 15406 1726854933.76609: stdout chunk (state=3): >>># destroy tokenize <<< 15406 1726854933.76644: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 15406 1726854933.76650: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 15406 1726854933.76677: stdout chunk (state=3): >>># destroy _typing <<< 15406 1726854933.76686: stdout chunk (state=3): >>># destroy _tokenize <<< 15406 1726854933.76850: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15406 1726854933.76876: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs <<< 15406 1726854933.76880: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 15406 1726854933.76888: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect <<< 15406 1726854933.76910: stdout chunk (state=3): >>># destroy time <<< 15406 1726854933.76933: stdout chunk (state=3): >>># destroy _random <<< 15406 1726854933.76939: stdout chunk (state=3): >>># destroy _weakref <<< 15406 1726854933.76955: stdout chunk (state=3): >>># destroy _hashlib <<< 15406 1726854933.76974: stdout chunk (state=3): >>># destroy _operator <<< 15406 1726854933.76980: stdout chunk (state=3): >>># destroy _sre # destroy _string # destroy re <<< 15406 1726854933.77024: stdout chunk (state=3): >>># destroy itertools # destroy _abc <<< 15406 1726854933.77029: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 15406 1726854933.77037: stdout chunk (state=3): >>> <<< 15406 1726854933.77045: stdout chunk (state=3): >>># clear sys.audit hooks <<< 15406 1726854933.77599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 15406 1726854933.77631: stderr chunk (state=3): >>><<< 15406 1726854933.77634: stdout chunk (state=3): >>><<< 15406 1726854933.77745: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956a184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9569e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956a1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956809130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956809fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956847dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956847fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95687f800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95687fe90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95685faa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95685d1c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956844f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95689f6e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95689e300> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95685e060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956846e70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568d47a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956844200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9568d4c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568d4b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9568d4ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956842d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568d55b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568d5280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568d64b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568ec680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9568edd30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568eebd0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9568ef230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568ee120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9568efcb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568ef3e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568d6450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9565ebbc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa956614710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956614470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9566146b0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa956614fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa956615910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956614890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9565e9d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956616cc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956615790> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9568d6ba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956643020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9566633e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9566c4200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9566c6960> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9566c4320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9566911f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955f252e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9566621e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa956617bf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa956662300> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_jzq9l6nu/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa955f8af90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955f69e80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955f69040> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955f88e60> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955fbe930> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955fbe720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955fbe030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955fbea50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955f8bc20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955fbf680> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955fbf8c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955fbfe00> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e25b80> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e277a0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa955e281a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e29340> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e2bdd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e28110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e2a0f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e33c80> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e32750> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e324b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e32a20> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e2a5a0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e77f20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e780b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e79b80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e79940> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e7c110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e7a270> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e7f860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e7c230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e80890> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e80a40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e80920> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e78290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955d0c0e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955d0d430> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e82870> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955e83c20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e824e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955d11610> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d12360> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d0d5e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d12ab0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d136b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955d1e090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d19040> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955e06bd0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955efe8d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d1e3c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d1e180> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955db2300> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a1c260> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955a1c5c0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955d98590> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955db2ea0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955db09e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955db0620> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955a1f5f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a1eea0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955a1f080> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a1e300> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a1f6e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955a821e0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a80200> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955db0530> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a83710> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a82cf0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955abe2a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955a83a40> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa955ad1e50> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955ad36b0> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa95586a9f0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955869130> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955fbe7e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9558b0e30> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9558b1be0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa955ac03e0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9558b3da0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", 
"ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", 
"promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off 
[fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "55", "second": "33", "epoch": "1726854933", "epoch_int": "1726854933", "date": 
"2024-09-20", "time": "13:55:33", "iso8601_micro": "2024-09-20T17:55:33.407214Z", "iso8601": "2024-09-20T17:55:33Z", "iso8601_basic": "20240920T135533407214", "iso8601_basic_short": "20240920T135533", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2961, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 570, "free": 2961}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", 
"holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 716, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797011456, "block_size": 4096, "block_total": 65519099, "block_available": 63915286, "block_used": 1603813, "inode_total": 131070960, "inode_available": 131029049, "inode_used": 41911, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 
51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.544921875, "5m": 0.3798828125, "15m": 0.18359375}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # 
cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # 
cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # 
cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing 
__mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] 
removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # 
destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # 
destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # 
destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] 
wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 Shared 
connection to 10.31.45.178 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing 
re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # 
cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # 
cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] 
removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] 
removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy 
json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] 
wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
15406 1726854933.78834: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854932.2387927-15443-101208329219445/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854933.78854: _low_level_execute_command(): starting 15406 1726854933.78859: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854932.2387927-15443-101208329219445/ > /dev/null 2>&1 && sleep 0' 15406 1726854933.79323: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854933.79326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854933.79328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854933.79331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854933.79333: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854933.79401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854933.79405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854933.79507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854933.81900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854933.81917: stderr chunk (state=3): >>><<< 15406 1726854933.81920: stdout chunk (state=3): >>><<< 15406 1726854933.81933: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 15406 1726854933.81940: handler run complete 15406 1726854933.82017: variable 'ansible_facts' from source: unknown 15406 1726854933.82074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854933.82262: variable 'ansible_facts' from source: unknown 15406 1726854933.82315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854933.82424: attempt loop complete, returning result 15406 1726854933.82427: _execute() done 15406 1726854933.82432: dumping result to json 15406 1726854933.82492: done dumping result, returning 15406 1726854933.82496: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-3c83-32d3-00000000007e] 15406 1726854933.82498: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000007e 15406 1726854933.83237: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000007e 15406 1726854933.83240: WORKER PROCESS EXITING ok: [managed_node2] 15406 1726854933.83681: no more pending results, returning what we have 15406 1726854933.83684: results queue empty 15406 1726854933.83685: checking for any_errors_fatal 15406 1726854933.83686: done checking for any_errors_fatal 15406 1726854933.83689: checking for max_fail_percentage 15406 1726854933.83692: done checking for max_fail_percentage 15406 1726854933.83692: checking to see if all hosts have failed and the running result is not ok 15406 1726854933.83693: done checking to see if all hosts have failed 15406 1726854933.83694: getting the remaining hosts for this loop 15406 1726854933.83695: done getting the remaining hosts for this loop 15406 1726854933.83698: getting the next task for host managed_node2 15406 1726854933.83704: done getting next task for host managed_node2 15406 1726854933.83705: ^ task is: TASK: meta (flush_handlers) 15406 1726854933.83707: ^ state is: HOST 
STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854933.83710: getting variables 15406 1726854933.83711: in VariableManager get_vars() 15406 1726854933.83730: Calling all_inventory to load vars for managed_node2 15406 1726854933.83732: Calling groups_inventory to load vars for managed_node2 15406 1726854933.83735: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854933.83744: Calling all_plugins_play to load vars for managed_node2 15406 1726854933.83746: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854933.83748: Calling groups_plugins_play to load vars for managed_node2 15406 1726854933.83925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854933.84141: done with get_vars() 15406 1726854933.84152: done getting variables 15406 1726854933.84226: in VariableManager get_vars() 15406 1726854933.84236: Calling all_inventory to load vars for managed_node2 15406 1726854933.84239: Calling groups_inventory to load vars for managed_node2 15406 1726854933.84241: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854933.84246: Calling all_plugins_play to load vars for managed_node2 15406 1726854933.84248: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854933.84251: Calling groups_plugins_play to load vars for managed_node2 15406 1726854933.84393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854933.84576: done with get_vars() 15406 1726854933.84591: done queuing things up, now waiting for results queue to drain 15406 1726854933.84593: results queue empty 15406 1726854933.84594: checking for 
any_errors_fatal 15406 1726854933.84596: done checking for any_errors_fatal 15406 1726854933.84597: checking for max_fail_percentage 15406 1726854933.84602: done checking for max_fail_percentage 15406 1726854933.84603: checking to see if all hosts have failed and the running result is not ok 15406 1726854933.84604: done checking to see if all hosts have failed 15406 1726854933.84605: getting the remaining hosts for this loop 15406 1726854933.84606: done getting the remaining hosts for this loop 15406 1726854933.84608: getting the next task for host managed_node2 15406 1726854933.84612: done getting next task for host managed_node2 15406 1726854933.84615: ^ task is: TASK: Include the task 'el_repo_setup.yml' 15406 1726854933.84616: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854933.84618: getting variables 15406 1726854933.84619: in VariableManager get_vars() 15406 1726854933.84632: Calling all_inventory to load vars for managed_node2 15406 1726854933.84635: Calling groups_inventory to load vars for managed_node2 15406 1726854933.84637: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854933.84642: Calling all_plugins_play to load vars for managed_node2 15406 1726854933.84644: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854933.84647: Calling groups_plugins_play to load vars for managed_node2 15406 1726854933.84780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854933.84985: done with get_vars() 15406 1726854933.84996: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:11 Friday 20 September 2024 13:55:33 -0400 (0:00:01.662) 0:00:01.673 ****** 15406 1726854933.85072: entering _queue_task() for managed_node2/include_tasks 15406 1726854933.85073: Creating lock for include_tasks 15406 1726854933.85430: worker is 1 (out of 1 available) 15406 1726854933.85442: exiting _queue_task() for managed_node2/include_tasks 15406 1726854933.85452: done queuing things up, now waiting for results queue to drain 15406 1726854933.85454: waiting for pending results... 
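The handoff logged above — `entering _queue_task()`, `worker is 1 (out of 1 available)`, then `waiting for pending results...` — is a queue-based dispatch pattern: the strategy loop hands the task to a worker and then blocks on a results queue until the worker posts its outcome. A minimal sketch of that pattern (illustrative only — `run_task` and `queue_task` are hypothetical stand-ins, not Ansible's actual implementation, which uses worker processes rather than threads):

```python
import queue
import threading

def run_task(task):
    # Stand-in for TaskExecutor().run(); name and return value are
    # illustrative only.
    return f"ran {task}"

def queue_task(task, results):
    # Mirrors the _queue_task() shape seen in the log: hand the task to
    # a worker, then let the caller block on the results queue
    # ("waiting for pending results...") until the worker posts back.
    worker = threading.Thread(target=lambda: results.put(run_task(task)))
    worker.start()
    return worker

results = queue.Queue()
w = queue_task("include_tasks: el_repo_setup.yml", results)
w.join()
print(results.get())  # -> ran include_tasks: el_repo_setup.yml
```

The log's `done sending task result` / `WORKER PROCESS EXITING` lines correspond to the worker's `results.put(...)` and thread exit in this sketch.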
15406 1726854933.85647: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 15406 1726854933.85711: in run() - task 0affcc66-ac2b-3c83-32d3-000000000006 15406 1726854933.85724: variable 'ansible_search_path' from source: unknown 15406 1726854933.85751: calling self._execute() 15406 1726854933.85817: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854933.85821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854933.85830: variable 'omit' from source: magic vars 15406 1726854933.85905: _execute() done 15406 1726854933.85908: dumping result to json 15406 1726854933.85911: done dumping result, returning 15406 1726854933.85915: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0affcc66-ac2b-3c83-32d3-000000000006] 15406 1726854933.85920: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000006 15406 1726854933.86008: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000006 15406 1726854933.86010: WORKER PROCESS EXITING 15406 1726854933.86050: no more pending results, returning what we have 15406 1726854933.86054: in VariableManager get_vars() 15406 1726854933.86083: Calling all_inventory to load vars for managed_node2 15406 1726854933.86085: Calling groups_inventory to load vars for managed_node2 15406 1726854933.86090: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854933.86101: Calling all_plugins_play to load vars for managed_node2 15406 1726854933.86104: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854933.86106: Calling groups_plugins_play to load vars for managed_node2 15406 1726854933.86225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854933.86330: done with get_vars() 15406 1726854933.86335: variable 'ansible_search_path' from source: unknown 15406 1726854933.86344: we have 
included files to process 15406 1726854933.86345: generating all_blocks data 15406 1726854933.86346: done generating all_blocks data 15406 1726854933.86346: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15406 1726854933.86347: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15406 1726854933.86349: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15406 1726854933.86773: in VariableManager get_vars() 15406 1726854933.86783: done with get_vars() 15406 1726854933.86792: done processing included file 15406 1726854933.86793: iterating over new_blocks loaded from include file 15406 1726854933.86794: in VariableManager get_vars() 15406 1726854933.86800: done with get_vars() 15406 1726854933.86801: filtering new block on tags 15406 1726854933.86810: done filtering new block on tags 15406 1726854933.86811: in VariableManager get_vars() 15406 1726854933.86817: done with get_vars() 15406 1726854933.86817: filtering new block on tags 15406 1726854933.86829: done filtering new block on tags 15406 1726854933.86830: in VariableManager get_vars() 15406 1726854933.86836: done with get_vars() 15406 1726854933.86837: filtering new block on tags 15406 1726854933.86844: done filtering new block on tags 15406 1726854933.86845: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 15406 1726854933.86849: extending task lists for all hosts with included blocks 15406 1726854933.86876: done extending task lists 15406 1726854933.86877: done processing included files 15406 1726854933.86877: results queue empty 15406 1726854933.86877: checking for any_errors_fatal 15406 1726854933.86878: done checking for any_errors_fatal 15406 
1726854933.86879: checking for max_fail_percentage 15406 1726854933.86880: done checking for max_fail_percentage 15406 1726854933.86880: checking to see if all hosts have failed and the running result is not ok 15406 1726854933.86881: done checking to see if all hosts have failed 15406 1726854933.86881: getting the remaining hosts for this loop 15406 1726854933.86882: done getting the remaining hosts for this loop 15406 1726854933.86883: getting the next task for host managed_node2 15406 1726854933.86886: done getting next task for host managed_node2 15406 1726854933.86889: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 15406 1726854933.86891: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854933.86892: getting variables 15406 1726854933.86893: in VariableManager get_vars() 15406 1726854933.86898: Calling all_inventory to load vars for managed_node2 15406 1726854933.86899: Calling groups_inventory to load vars for managed_node2 15406 1726854933.86900: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854933.86904: Calling all_plugins_play to load vars for managed_node2 15406 1726854933.86905: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854933.86906: Calling groups_plugins_play to load vars for managed_node2 15406 1726854933.86998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854933.87103: done with get_vars() 15406 1726854933.87109: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:55:33 -0400 (0:00:00.020) 0:00:01.694 ****** 15406 1726854933.87150: entering _queue_task() for managed_node2/setup 15406 1726854933.87343: worker is 1 (out of 1 available) 15406 1726854933.87353: exiting _queue_task() for managed_node2/setup 15406 1726854933.87364: done queuing things up, now waiting for results queue to drain 15406 1726854933.87366: waiting for pending results... 
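The setup task queued here gathers only the minimum fact subset the network tests need; a few lines later the log shows the guarding Jinja2 condition, `not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts`, evaluating to True because no facts are cached yet. A rough Python rendering of that check (a simplification using sets — `facts_missing` is a hypothetical helper, and the real Jinja2 `intersect` filter compares lists, where ordering can matter):

```python
def facts_missing(ansible_facts, required_facts):
    # Approximates the logged condition: gathering is needed when the
    # intersection of already-gathered fact keys with the required set
    # does not cover every required fact.
    gathered = set(ansible_facts) & set(required_facts)
    return gathered != set(required_facts)

# First pass: nothing gathered yet, so the condition is True and the
# setup module runs, as in the log ("Evaluated conditional (...): True").
print(facts_missing({}, ["distribution", "os_family"]))  # -> True
```

Once the setup module has populated those keys in `ansible_facts`, the same condition evaluates to False and re-gathering is skipped on later includes.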
15406 1726854933.87582: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 15406 1726854933.87693: in run() - task 0affcc66-ac2b-3c83-32d3-00000000008f 15406 1726854933.87697: variable 'ansible_search_path' from source: unknown 15406 1726854933.87699: variable 'ansible_search_path' from source: unknown 15406 1726854933.87701: calling self._execute() 15406 1726854933.87732: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854933.87744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854933.87757: variable 'omit' from source: magic vars 15406 1726854933.88244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854933.89673: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854933.89719: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854933.89746: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854933.89781: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854933.89804: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854933.89860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854933.89884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854933.89903: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854933.89929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854933.89939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854933.90054: variable 'ansible_facts' from source: unknown 15406 1726854933.90100: variable 'network_test_required_facts' from source: task vars 15406 1726854933.90127: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 15406 1726854933.90130: variable 'omit' from source: magic vars 15406 1726854933.90155: variable 'omit' from source: magic vars 15406 1726854933.90175: variable 'omit' from source: magic vars 15406 1726854933.90293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854933.90296: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854933.90299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854933.90301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854933.90304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854933.90307: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854933.90310: variable 'ansible_host' from source: host vars for 
'managed_node2' 15406 1726854933.90311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854933.90356: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854933.90364: Set connection var ansible_timeout to 10 15406 1726854933.90370: Set connection var ansible_connection to ssh 15406 1726854933.90376: Set connection var ansible_shell_type to sh 15406 1726854933.90381: Set connection var ansible_shell_executable to /bin/sh 15406 1726854933.90384: Set connection var ansible_pipelining to False 15406 1726854933.90463: variable 'ansible_shell_executable' from source: unknown 15406 1726854933.90467: variable 'ansible_connection' from source: unknown 15406 1726854933.90469: variable 'ansible_module_compression' from source: unknown 15406 1726854933.90471: variable 'ansible_shell_type' from source: unknown 15406 1726854933.90474: variable 'ansible_shell_executable' from source: unknown 15406 1726854933.90476: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854933.90481: variable 'ansible_pipelining' from source: unknown 15406 1726854933.90484: variable 'ansible_timeout' from source: unknown 15406 1726854933.90486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854933.90605: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854933.90609: variable 'omit' from source: magic vars 15406 1726854933.90611: starting attempt loop 15406 1726854933.90614: running the handler 15406 1726854933.90616: _low_level_execute_command(): starting 15406 1726854933.90692: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854933.91333: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 
1726854933.91351: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854933.91376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854933.91493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854933.91508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854933.91608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854933.93207: stdout chunk (state=3): >>>/root <<< 15406 1726854933.93353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854933.93371: stderr chunk (state=3): >>><<< 15406 1726854933.93380: stdout chunk (state=3): >>><<< 15406 1726854933.93471: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854933.93482: _low_level_execute_command(): starting 15406 1726854933.93486: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854933.9341307-15529-143904489761080 `" && echo ansible-tmp-1726854933.9341307-15529-143904489761080="` echo /root/.ansible/tmp/ansible-tmp-1726854933.9341307-15529-143904489761080 `" ) && sleep 0' 15406 1726854933.94107: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854933.94162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854933.94167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.178 is address debug1: re-parsing configuration <<< 15406 1726854933.94180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854933.94239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854933.94290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854933.94308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854933.94327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854933.94430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854933.96299: stdout chunk (state=3): >>>ansible-tmp-1726854933.9341307-15529-143904489761080=/root/.ansible/tmp/ansible-tmp-1726854933.9341307-15529-143904489761080 <<< 15406 1726854933.96595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854933.96599: stdout chunk (state=3): >>><<< 15406 1726854933.96601: stderr chunk (state=3): >>><<< 15406 1726854933.96604: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854933.9341307-15529-143904489761080=/root/.ansible/tmp/ansible-tmp-1726854933.9341307-15529-143904489761080 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854933.96606: variable 'ansible_module_compression' from source: unknown 15406 1726854933.96609: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15406 1726854933.96653: variable 'ansible_facts' from source: unknown 15406 1726854933.96883: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854933.9341307-15529-143904489761080/AnsiballZ_setup.py 15406 1726854933.97055: Sending initial data 15406 1726854933.97066: Sent initial data (154 bytes) 15406 1726854933.97742: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854933.97813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854933.97870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854933.97891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854933.97964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854933.98040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854933.99620: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854933.99922: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854933.99960: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpfq5tata3 /root/.ansible/tmp/ansible-tmp-1726854933.9341307-15529-143904489761080/AnsiballZ_setup.py <<< 15406 1726854933.99963: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854933.9341307-15529-143904489761080/AnsiballZ_setup.py" <<< 15406 1726854934.00017: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpfq5tata3" to remote "/root/.ansible/tmp/ansible-tmp-1726854933.9341307-15529-143904489761080/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854933.9341307-15529-143904489761080/AnsiballZ_setup.py" <<< 15406 1726854934.01872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854934.01945: stderr chunk (state=3): >>><<< 15406 1726854934.01963: stdout chunk (state=3): >>><<< 15406 1726854934.01996: done transferring module to remote 15406 1726854934.02015: _low_level_execute_command(): starting 15406 1726854934.02024: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854933.9341307-15529-143904489761080/ /root/.ansible/tmp/ansible-tmp-1726854933.9341307-15529-143904489761080/AnsiballZ_setup.py && sleep 0' 15406 1726854934.02665: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854934.02682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854934.02698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854934.02756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854934.02820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854934.02851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854934.02943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854934.04805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854934.04841: stderr chunk (state=3): >>><<< 15406 1726854934.04850: stdout chunk (state=3): >>><<< 15406 1726854934.04913: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854934.04916: _low_level_execute_command(): starting 15406 1726854934.04918: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854933.9341307-15529-143904489761080/AnsiballZ_setup.py && sleep 0' 15406 1726854934.05496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854934.05512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854934.05605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854934.05630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854934.05645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854934.05668: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854934.05779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854934.08279: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15406 1726854934.08334: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 15406 1726854934.08399: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 15406 1726854934.08443: stdout chunk (state=3): >>>import 'posix' # <<< 15406 1726854934.08523: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 15406 1726854934.08526: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 15406 1726854934.08576: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854934.08579: stdout chunk (state=3): >>>import '_codecs' # <<< 15406 1726854934.08581: stdout chunk (state=3): >>>import 'codecs' # <<< 15406 1726854934.08729: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 15406 1726854934.08778: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c244184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c243e7b30> <<< 15406 1726854934.08804: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 15406 1726854934.08807: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2441aa50> <<< 15406 1726854934.08809: stdout chunk (state=3): >>>import '_signal' # <<< 15406 1726854934.08812: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 15406 1726854934.08814: stdout chunk (state=3): >>>import 'io' # <<< 15406 1726854934.08816: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 15406 1726854934.08903: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 15406 1726854934.08980: stdout chunk (state=3): >>>import 'os' # <<< 15406 1726854934.08988: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 15406 1726854934.09019: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 15406 1726854934.09249: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 15406 1726854934.09271: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c24209130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c24209fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 
(Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15406 1726854934.09814: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15406 1726854934.09844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 15406 1726854934.09903: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 15406 1726854934.09907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854934.09945: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py<<< 15406 1726854934.10010: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 15406 1726854934.10043: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15406 1726854934.10098: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc'<<< 15406 1726854934.10119: stdout chunk (state=3): >>> <<< 15406 1726854934.10147: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c24247e90> <<< 15406 1726854934.10150: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py<<< 15406 1726854934.10429: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c24247f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from 
'/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854934.10447: stdout chunk (state=3): >>>import 'itertools' # <<< 15406 1726854934.10488: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 15406 1726854934.10510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2427f890><<< 15406 1726854934.10551: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 15406 1726854934.10681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2427ff20> import '_collections' # <<< 15406 1726854934.10706: stdout chunk (state=3): >>> import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2425fb60> import '_functools' # <<< 15406 1726854934.10768: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2425d280> <<< 15406 1726854934.10916: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c24245040> <<< 15406 1726854934.10959: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15406 1726854934.11125: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15406 1726854934.11129: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc'<<< 15406 1726854934.11214: stdout chunk (state=3): >>> import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2429f800> <<< 15406 1726854934.11302: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2429e420> <<< 15406 1726854934.11310: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc'<<< 15406 1726854934.11313: stdout chunk (state=3): >>> import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2425e150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2429cc80><<< 15406 1726854934.11377: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 15406 1726854934.11416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242d4890> <<< 15406 1726854934.11550: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242442c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module 
'_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c242d4d40> <<< 15406 1726854934.11601: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242d4bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854934.11626: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c242d4fe0><<< 15406 1726854934.11732: stdout chunk (state=3): >>> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c24242de0><<< 15406 1726854934.11735: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854934.11856: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 15406 1726854934.11926: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 15406 1726854934.11930: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242d56d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242d53a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242d65d0> 
import 'importlib.util' # import 'runpy' # <<< 15406 1726854934.12041: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242ec7a0> <<< 15406 1726854934.12354: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c242edeb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242eed50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c242ef380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242ee2a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from 
'/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c242efe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242ef530> <<< 15406 1726854934.12358: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242d6570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 15406 1726854934.12376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15406 1726854934.12484: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 15406 1726854934.12553: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23fe3ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2400c830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2400c590> # extension module '_random' loaded from 
'/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2400c770> <<< 15406 1726854934.12618: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15406 1726854934.12632: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854934.12754: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2400d100> <<< 15406 1726854934.13007: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2400daf0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2400c9b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23fe1e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 15406 1726854934.13024: stdout chunk (state=3): >>>import '_weakrefset' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3c2400eea0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2400dc10> <<< 15406 1726854934.13139: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242d6cc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15406 1726854934.13172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15406 1726854934.13198: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c24037230> <<< 15406 1726854934.13261: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 15406 1726854934.13364: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854934.13385: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2405b590> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15406 1726854934.13414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15406 1726854934.13482: stdout chunk (state=3): >>>import 'ntpath' # <<< 15406 1726854934.13886: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c240bc2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c240bea50> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c240bc410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c240813a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c239253d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2405a3c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2400fe00> <<< 15406 1726854934.14243: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3c23925670> <<< 15406 1726854934.14550: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_k9wvds7b/ansible_setup_payload.zip'<<< 15406 1726854934.14641: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15406 1726854934.14774: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 
1726854934.14854: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 15406 1726854934.14904: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py<<< 15406 1726854934.14922: stdout chunk (state=3): >>> <<< 15406 1726854934.15107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2398f0e0><<< 15406 1726854934.15110: stdout chunk (state=3): >>> import '_typing' # <<< 15406 1726854934.15410: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2396dfd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2396d190> # zipimport: zlib available<<< 15406 1726854934.15445: stdout chunk (state=3): >>> import 'ansible' # <<< 15406 1726854934.15477: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.15553: stdout chunk (state=3): >>># zipimport: zlib available<<< 15406 1726854934.15556: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15406 1726854934.15559: stdout chunk (state=3): >>> import 'ansible.module_utils' # <<< 15406 1726854934.15638: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.17740: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.19633: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 15406 1726854934.19637: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2398d3d0> <<< 15406 1726854934.19773: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py<<< 15406 1726854934.19797: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 15406 1726854934.19823: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so'<<< 15406 1726854934.19893: stdout chunk (state=3): >>> import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c239beb70> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c239be900> <<< 15406 1726854934.20010: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c239be210> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc'<<< 15406 1726854934.20048: stdout chunk (state=3): >>> import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c239be660> <<< 15406 1726854934.20073: stdout chunk (state=3): >>>import 'json' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3c2398fd70> <<< 15406 1726854934.20100: stdout chunk (state=3): >>>import 'atexit' # <<< 15406 1726854934.20125: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 15406 1726854934.20228: stdout chunk (state=3): >>> import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c239bf890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c239bfad0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15406 1726854934.20296: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 15406 1726854934.20321: stdout chunk (state=3): >>>import '_locale' # <<< 15406 1726854934.20395: stdout chunk (state=3): >>> import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c239bffe0> <<< 15406 1726854934.20449: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15406 1726854934.20493: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15406 1726854934.20553: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23829df0><<< 15406 1726854934.20586: stdout chunk (state=3): >>> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854934.20652: stdout chunk 
(state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854934.20674: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2382ba10> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15406 1726854934.20916: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2382c410> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2382d5b0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py<<< 15406 1726854934.20971: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15406 1726854934.21112: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2382ffb0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c238343e0> <<< 15406 1726854934.21174: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2382e330> # 
/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15406 1726854934.21243: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 15406 1726854934.21332: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15406 1726854934.21441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 15406 1726854934.21502: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 15406 1726854934.21505: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc'<<< 15406 1726854934.21689: stdout chunk (state=3): >>> import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23837f50> <<< 15406 1726854934.21694: stdout chunk (state=3): >>>import '_tokenize' # <<< 15406 1726854934.21728: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23836a50> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c238367b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc'<<< 15406 1726854934.21740: stdout chunk (state=3): >>> <<< 15406 1726854934.21823: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23836cf0> <<< 15406 1726854934.21858: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2382e840> <<< 15406 
1726854934.21902: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854934.21930: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854934.21972: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2387c140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 15406 1726854934.22045: stdout chunk (state=3): >>> <<< 15406 1726854934.22085: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2387c290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc'<<< 15406 1726854934.22106: stdout chunk (state=3): >>> <<< 15406 1726854934.22159: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854934.22172: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2387dd90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2387db50> # 
/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15406 1726854934.22278: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c238802f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2387e480> <<< 15406 1726854934.22381: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854934.22384: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 15406 1726854934.22481: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23883aa0> <<< 15406 1726854934.22625: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23880470> <<< 15406 1726854934.22744: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c238848f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' 
# extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23884ad0> <<< 15406 1726854934.22790: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23884e60> <<< 15406 1726854934.22827: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2387c4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 15406 1726854934.22868: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15406 1726854934.23092: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854934.23116: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2370c3e0> <<< 15406 1726854934.23162: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2370d370> <<< 15406 1726854934.23184: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23886ba0> <<< 15406 1726854934.23236: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23887f50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c238867e0> # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.23249: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 15406 1726854934.23445: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.23511: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.23558: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 15406 1726854934.23573: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 15406 1726854934.23902: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.23905: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.23931: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.24801: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.25776: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 
'ansible.module_utils.common.text.converters' # <<< 15406 1726854934.25780: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15406 1726854934.25807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23715610> <<< 15406 1726854934.25996: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23716360> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2370d5e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 15406 1726854934.26022: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 15406 1726854934.26248: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.26490: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 15406 1726854934.26570: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c237161b0> # zipimport: zlib available <<< 15406 1726854934.27334: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.27941: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 15406 1726854934.28040: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.28147: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 15406 1726854934.28163: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.28210: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.28251: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 15406 1726854934.28262: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.28577: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 15406 1726854934.28581: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.28645: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 15406 1726854934.28648: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.29003: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.29552: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23717530> # zipimport: zlib available <<< 15406 1726854934.29645: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.29797: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 15406 1726854934.29844: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.30316: stdout chunk 
(state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.30320: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854934.30350: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23722300> <<< 15406 1726854934.30403: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2371cd70> <<< 15406 1726854934.30447: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 15406 1726854934.30665: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.30711: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854934.30737: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15406 1726854934.30781: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15406 1726854934.30798: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15406 1726854934.30904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15406 1726854934.31002: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2380ac30> <<< 15406 1726854934.31108: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c239ea900> <<< 15406 1726854934.31170: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c237224e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23717290> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 15406 1726854934.31191: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.31317: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.31340: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15406 1726854934.31447: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 15406 1726854934.31464: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.31556: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.31568: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.31595: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 15406 1726854934.31653: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.31706: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.31760: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.31874: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 15406 1726854934.31922: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.32239: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 15406 1726854934.32368: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.32637: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.32750: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.32775: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854934.32802: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 15406 1726854934.32833: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 15406 1726854934.32884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 15406 1726854934.32950: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c237b23f0> <<< 15406 
1726854934.32962: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 15406 1726854934.33007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 15406 1726854934.33093: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 15406 1726854934.33104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c234102f0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854934.33141: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23410620> <<< 15406 1726854934.33181: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23798e90> <<< 15406 1726854934.33285: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c237b2f30> <<< 15406 1726854934.33409: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c237b0a70> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c237b05f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # 
code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 15406 1726854934.33446: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c234135f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23412ea0> <<< 15406 1726854934.33472: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23413050> <<< 15406 1726854934.33499: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c234122d0> <<< 15406 1726854934.33612: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15406 1726854934.33863: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c234137a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # 
code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2346e2a0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2346c2c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c237b1b80> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 15406 1726854934.33965: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.34026: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 15406 1726854934.34068: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.34109: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.34199: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 15406 1726854934.34225: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 15406 1726854934.34234: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.34309: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.34408: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 15406 1726854934.34411: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.34454: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.caps' # <<< 15406 1726854934.34475: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.34529: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.34611: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 15406 1726854934.34763: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.34857: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.34915: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 15406 1726854934.35704: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.36366: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 15406 1726854934.36529: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.36532: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.36568: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.36619: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 15406 1726854934.36855: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # <<< 15406 1726854934.36981: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.37058: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 15406 1726854934.37083: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15406 1726854934.37142: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.37271: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 15406 1726854934.37400: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2346f5c0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15406 1726854934.37538: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2346ee40> import 'ansible.module_utils.facts.system.local' # <<< 15406 1726854934.37747: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 15406 1726854934.37936: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.38065: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 15406 1726854934.38111: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.38218: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 15406 1726854934.38236: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.38292: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.38409: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15406 1726854934.38504: stdout chunk (state=3): >>># extension module '_ssl' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854934.38629: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c234ae3f0> <<< 15406 1726854934.39021: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2349e1b0> import 'ansible.module_utils.facts.system.python' # <<< 15406 1726854934.39024: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.39053: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 15406 1726854934.39072: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.39195: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.39311: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.39739: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 15406 1726854934.39769: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.39814: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 15406 1726854934.39835: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.39910: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.39948: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 15406 1726854934.40064: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 
'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c234c1eb0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2349f3b0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.40096: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 15406 1726854934.40250: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 15406 1726854934.40425: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.40636: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 15406 1726854934.40654: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.40797: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.41026: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.41074: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 15406 1726854934.41106: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 15406 1726854934.41130: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15406 1726854934.41228: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.41399: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.41617: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 15406 1726854934.41684: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 15406 1726854934.41846: stdout chunk (state=3): >>># zipimport: zlib available<<< 15406 1726854934.41994: 
stdout chunk (state=3): >>> <<< 15406 1726854934.42068: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 15406 1726854934.42318: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.42321: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.43072: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.43883: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 15406 1726854934.43903: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 15406 1726854934.44096: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.44259: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 15406 1726854934.44284: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.44440: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.44612: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 15406 1726854934.44643: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.44889: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.45134: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 15406 1726854934.45292: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.45320: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.45367: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 15406 1726854934.45386: stdout chunk (state=3): >>># zipimport: zlib available<<< 15406 1726854934.45550: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15406 1726854934.45714: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15406 1726854934.46043: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.46426: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 15406 1726854934.46501: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.46606: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.46942: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 15406 1726854934.46946: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.46948: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 15406 1726854934.46970: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.46997: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 15406 1726854934.47021: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.47181: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # <<< 15406 1726854934.47206: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.47292: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.47388: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 15406 1726854934.47413: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.47849: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.48291: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 15406 1726854934.48295: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15406 1726854934.48392: stdout chunk (state=3): >>># 
zipimport: zlib available<<< 15406 1726854934.48442: stdout chunk (state=3): >>> <<< 15406 1726854934.48478: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 15406 1726854934.48510: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.48572: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.48629: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 15406 1726854934.48695: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.48752: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 15406 1726854934.48773: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.48808: stdout chunk (state=3): >>># zipimport: zlib available<<< 15406 1726854934.48897: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.network.openbsd' # <<< 15406 1726854934.48939: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.49010: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.49143: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 15406 1726854934.49173: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15406 1726854934.49282: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 15406 1726854934.49329: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # <<< 15406 1726854934.49402: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15406 1726854934.49428: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.49533: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854934.49719: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 
15406 1726854934.49845: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 15406 1726854934.49876: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 15406 1726854934.49992: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.49995: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.50093: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 15406 1726854934.50192: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.50736: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # <<< 15406 1726854934.50740: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.50915: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 15406 1726854934.51036: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.51059: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 15406 1726854934.51146: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.51177: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.51592: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 15406 1726854934.51595: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.51700: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854934.51837: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available <<< 15406 1726854934.52555: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 15406 1726854934.52577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 15406 1726854934.52602: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 15406 1726854934.52623: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 15406 1726854934.52677: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so'<<< 15406 1726854934.52682: stdout chunk (state=3): >>> # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so'<<< 15406 1726854934.52704: stdout chunk (state=3): >>> import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c232cbd70> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c232c8d40> <<< 15406 1726854934.52780: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c232c8b60><<< 15406 1726854934.52873: stdout chunk (state=3): >>> <<< 15406 1726854934.53731: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": 
"en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday<<< 15406 1726854934.53971: stdout chunk (state=3): >>>": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "55", "second": "34", "epoch": "1726854934", "epoch_int": "1726854934", "date": "2024-09-20", "time": "13:55:34", "iso8601_micro": "2024-09-20T17:55:34.523028Z", "iso8601": "2024-09-20T17:55:34Z", "iso8601_basic": "20240920T135534523028", "iso8601_basic_short": "20240920T135534", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, 
"minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_dns": {"search": 
["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15406 1726854934.54602: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 15406 1726854934.54691: stdout chunk (state=3): >>> # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv <<< 15406 1726854934.54720: stdout chunk (state=3): >>># clear sys.ps1 # clear sys.ps2 # clear sys.last_exc <<< 15406 1726854934.54763: stdout chunk (state=3): >>># clear sys.last_type # clear sys.last_value # clear sys.last_traceback <<< 15406 1726854934.54813: stdout chunk (state=3): >>># clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout<<< 15406 1726854934.54837: stdout chunk (state=3): >>> # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp<<< 15406 1726854934.54996: stdout chunk (state=3): >>> # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # 
cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil<<< 15406 1726854934.55050: stdout chunk (state=3): >>> # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # 
cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing<<< 15406 1726854934.55070: stdout chunk (state=3): >>> # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner<<< 15406 1726854934.55097: stdout chunk (state=3): >>> # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess<<< 15406 1726854934.55119: stdout chunk (state=3): >>> # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # 
cleanup[2] removing _uuid<<< 15406 1726854934.55144: stdout chunk (state=3): >>> # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket <<< 15406 1726854934.55166: stdout chunk (state=3): >>># cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes<<< 15406 1726854934.55198: stdout chunk (state=3): >>> # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors<<< 15406 1726854934.55219: stdout chunk (state=3): >>> # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] 
removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation<<< 15406 1726854934.55249: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux<<< 15406 1726854934.55418: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # 
cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # 
destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos<<< 15406 1726854934.55447: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing 
ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd<<< 15406 1726854934.55471: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors<<< 15406 1726854934.55498: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system<<< 15406 1726854934.55516: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy 
ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform<<< 15406 1726854934.55537: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux<<< 15406 1726854934.55556: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd<<< 15406 1726854934.55592: stdout chunk (state=3): >>> # 
destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly <<< 15406 1726854934.55644: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 15406 1726854934.56065: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15406 1726854934.56219: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc<<< 15406 1726854934.56222: stdout chunk (state=3): >>> # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib<<< 15406 1726854934.56246: stdout chunk (state=3): >>> # destroy zipfile._path.glob # destroy ipaddress<<< 15406 1726854934.56279: stdout chunk (state=3): >>> # destroy ntpath<<< 15406 1726854934.56307: stdout chunk (state=3): >>> # destroy importlib <<< 15406 1726854934.56330: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal <<< 15406 1726854934.56347: stdout chunk 
(state=3): >>># destroy systemd.daemon # destroy hashlib # destroy json.decoder <<< 15406 1726854934.56645: stdout chunk (state=3): >>># destroy json.encoder # destroy json.scanner # destroy _json<<< 15406 1726854934.56648: stdout chunk (state=3): >>> # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 15406 1726854934.56650: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection<<< 15406 1726854934.56652: stdout chunk (state=3): >>> # destroy multiprocessing.pool # destroy signal # destroy pickle<<< 15406 1726854934.56686: stdout chunk (state=3): >>> # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle <<< 15406 1726854934.56713: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process<<< 15406 1726854934.56727: stdout chunk (state=3): >>> # destroy unicodedata # destroy tempfile # destroy multiprocessing.util<<< 15406 1726854934.56752: stdout chunk (state=3): >>> # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex<<< 15406 1726854934.56770: stdout chunk (state=3): >>> # destroy fcntl<<< 15406 1726854934.56798: stdout chunk (state=3): >>> # destroy datetime # destroy subprocess<<< 15406 1726854934.56815: stdout chunk (state=3): >>> # destroy base64<<< 15406 1726854934.56844: stdout chunk (state=3): >>> # destroy _ssl <<< 15406 1726854934.56864: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux<<< 15406 1726854934.56905: stdout chunk (state=3): >>> # destroy getpass # destroy pwd # destroy termios # destroy errno<<< 15406 
1726854934.56940: stdout chunk (state=3): >>> # destroy json # destroy socket<<< 15406 1726854934.57009: stdout chunk (state=3): >>> # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing<<< 15406 1726854934.57071: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna<<< 15406 1726854934.57078: stdout chunk (state=3): >>> # destroy stringprep<<< 15406 1726854934.57099: stdout chunk (state=3): >>> # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian<<< 15406 1726854934.57159: stdout chunk (state=3): >>> # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 15406 1726854934.57170: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib<<< 15406 1726854934.57200: stdout chunk (state=3): >>> # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings<<< 15406 1726854934.57216: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix<<< 15406 
1726854934.57242: stdout chunk (state=3): >>> # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser<<< 15406 1726854934.57263: stdout chunk (state=3): >>> # cleanup[3] wiping _sre<<< 15406 1726854934.57295: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections<<< 15406 1726854934.57307: stdout chunk (state=3): >>> # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath <<< 15406 1726854934.57327: stdout chunk (state=3): >>># cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 <<< 15406 1726854934.57356: stdout chunk (state=3): >>># cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 15406 1726854934.57366: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 15406 1726854934.57399: stdout chunk (state=3): >>> # cleanup[3] wiping builtins # destroy selinux._selinux<<< 15406 1726854934.57539: stdout chunk (state=3): >>> # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15406 1726854934.57611: stdout chunk (state=3): >>># destroy sys.monitoring <<< 15406 1726854934.57622: stdout chunk (state=3): >>># destroy _socket <<< 15406 1726854934.57656: stdout chunk (state=3): >>># destroy 
_collections <<< 15406 1726854934.57706: stdout chunk (state=3): >>># destroy platform<<< 15406 1726854934.57746: stdout chunk (state=3): >>> # destroy _uuid # destroy stat <<< 15406 1726854934.57800: stdout chunk (state=3): >>># destroy genericpath # destroy re._parser <<< 15406 1726854934.57804: stdout chunk (state=3): >>># destroy tokenize<<< 15406 1726854934.57816: stdout chunk (state=3): >>> <<< 15406 1726854934.57877: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing<<< 15406 1726854934.57904: stdout chunk (state=3): >>> # destroy _tokenize<<< 15406 1726854934.57914: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib_parse<<< 15406 1726854934.58042: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15406 1726854934.58104: stdout chunk (state=3): >>># destroy codecs <<< 15406 1726854934.58145: stdout chunk (state=3): >>># destroy encodings.aliases # destroy encodings.utf_8 <<< 15406 1726854934.58209: stdout chunk (state=3): >>># destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io<<< 15406 1726854934.58222: stdout chunk (state=3): >>> # destroy traceback # destroy warnings # destroy weakref<<< 15406 1726854934.58259: stdout chunk (state=3): >>> # destroy collections # destroy threading # destroy atexit # destroy _warnings<<< 15406 1726854934.58283: stdout chunk (state=3): >>> # destroy math # destroy _bisect # destroy time<<< 15406 1726854934.58312: stdout chunk 
(state=3): >>> # destroy _random # destroy _weakref <<< 15406 1726854934.58358: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator<<< 15406 1726854934.58391: stdout chunk (state=3): >>> # destroy _sre # destroy _string # destroy re # destroy itertools <<< 15406 1726854934.58420: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 15406 1726854934.58438: stdout chunk (state=3): >>> # clear sys.audit hooks<<< 15406 1726854934.58543: stdout chunk (state=3): >>> <<< 15406 1726854934.59104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854934.59107: stdout chunk (state=3): >>><<< 15406 1726854934.59110: stderr chunk (state=3): >>><<< 15406 1726854934.59445: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c244184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c243e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2441aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c24209130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c24209fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c24247e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c24247f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2427f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3c2427ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2425fb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2425d280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c24245040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2429f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2429e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2425e150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2429cc80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242d4890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242442c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c242d4d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242d4bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c242d4fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c24242de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242d56d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242d53a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242d65d0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242ec7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c242edeb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242eed50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c242ef380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242ee2a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c242efe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242ef530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242d6570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23fe3ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2400c830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2400c590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2400c770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2400d100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2400daf0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2400c9b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23fe1e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2400eea0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2400dc10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c242d6cc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3c24037230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2405b590> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c240bc2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c240bea50> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c240bc410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c240813a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3c239253d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2405a3c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2400fe00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3c23925670> # zipimport: found 103 names in '/tmp/ansible_setup_payload_k9wvds7b/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2398f0e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2396dfd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2396d190> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2398d3d0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c239beb70> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c239be900> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c239be210> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c239be660> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2398fd70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c239bf890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c239bfad0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c239bffe0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23829df0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2382ba10> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2382c410> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2382d5b0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2382ffb0> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c238343e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2382e330> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23837f50> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23836a50> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c238367b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23836cf0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2382e840> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2387c140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2387c290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2387dd90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2387db50> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c238802f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2387e480> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23883aa0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23880470> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c238848f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23884ad0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23884e60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2387c4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2370c3e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2370d370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23886ba0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23887f50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c238867e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23715610> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23716360> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2370d5e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c237161b0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23717530> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23722300> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2371cd70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2380ac30> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c239ea900> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c237224e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23717290> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c237b23f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c234102f0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23410620> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23798e90> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f3c237b2f30> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c237b0a70> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c237b05f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c234135f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c23412ea0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c23413050> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c234122d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c234137a0> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c2346e2a0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2346c2c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c237b1b80> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2346f5c0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2346ee40> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c234ae3f0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2349e1b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c234c1eb0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c2349f3b0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3c232cbd70> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c232c8d40> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3c232c8b60> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", 
"SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "55", "second": "34", "epoch": "1726854934", "epoch_int": "1726854934", "date": "2024-09-20", "time": "13:55:34", "iso8601_micro": "2024-09-20T17:55:34.523028Z", "iso8601": "2024-09-20T17:55:34Z", "iso8601_basic": "20240920T135534523028", "iso8601_basic_short": "20240920T135534", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", 
"ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] 
removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # 
cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing 
ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing 
ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing 
ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy 
ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy 
importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] 
wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. [WARNING]: Module invocation had junk after the JSON data: [interpreter module-cleanup trace, identical to the stderr dump above] 15406 1726854934.61138: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, 
'_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854933.9341307-15529-143904489761080/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854934.61141: _low_level_execute_command(): starting 15406 1726854934.61144: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854933.9341307-15529-143904489761080/ > /dev/null 2>&1 && sleep 0' 15406 1726854934.61156: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854934.61178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854934.61197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854934.61214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854934.61249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854934.61262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854934.61301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854934.61357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854934.61373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854934.61399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854934.61502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854934.64164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854934.64211: stdout chunk (state=3): >>><<< 15406 1726854934.64214: stderr chunk (state=3): >>><<< 15406 1726854934.64488: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 15406 1726854934.64492: handler run complete 15406 1726854934.64495: variable 'ansible_facts' from source: unknown 15406 1726854934.64512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854934.64758: variable 'ansible_facts' from source: unknown 15406 1726854934.64812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854934.64947: attempt loop complete, returning result 15406 1726854934.64955: _execute() done 15406 1726854934.64961: dumping result to json 15406 1726854934.65108: done dumping result, returning 15406 1726854934.65217: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcc66-ac2b-3c83-32d3-00000000008f] 15406 1726854934.65220: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000008f ok: [managed_node2] 15406 1726854934.65412: no more pending results, returning what we have 15406 1726854934.65415: results queue empty 15406 1726854934.65416: checking for any_errors_fatal 15406 1726854934.65417: done checking for any_errors_fatal 15406 1726854934.65418: checking for max_fail_percentage 15406 1726854934.65420: done checking for max_fail_percentage 15406 1726854934.65421: checking to see if all hosts have failed and the running result is not ok 15406 1726854934.65421: done checking to see if all hosts have failed 15406 1726854934.65422: getting the remaining hosts for this loop 15406 1726854934.65423: done getting the remaining hosts for this loop 15406 1726854934.65427: getting the next task for host managed_node2 15406 1726854934.65435: done getting next task for host managed_node2 15406 1726854934.65437: ^ task is: TASK: Check if system is ostree 15406 1726854934.65440: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854934.65443: getting variables 15406 1726854934.65444: in VariableManager get_vars() 15406 1726854934.65471: Calling all_inventory to load vars for managed_node2 15406 1726854934.65474: Calling groups_inventory to load vars for managed_node2 15406 1726854934.65477: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854934.65745: Calling all_plugins_play to load vars for managed_node2 15406 1726854934.65749: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854934.65754: Calling groups_plugins_play to load vars for managed_node2 15406 1726854934.66042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854934.66463: done with get_vars() 15406 1726854934.66472: done getting variables 15406 1726854934.66608: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000008f 15406 1726854934.66613: WORKER PROCESS EXITING TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 13:55:34 -0400 (0:00:00.795) 0:00:02.489 ****** 15406 1726854934.66679: entering _queue_task() for managed_node2/stat 15406 1726854934.67310: worker is 1 (out of 1 available) 15406 1726854934.67322: exiting _queue_task() for managed_node2/stat 15406 1726854934.67335: done queuing things up, now waiting for results queue to drain 15406 
1726854934.67336: waiting for pending results... 15406 1726854934.67585: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 15406 1726854934.67758: in run() - task 0affcc66-ac2b-3c83-32d3-000000000091 15406 1726854934.67774: variable 'ansible_search_path' from source: unknown 15406 1726854934.67783: variable 'ansible_search_path' from source: unknown 15406 1726854934.67827: calling self._execute() 15406 1726854934.67902: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854934.67913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854934.67927: variable 'omit' from source: magic vars 15406 1726854934.68795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854934.69191: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854934.69311: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854934.69353: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854934.69477: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854934.69600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854934.69653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854934.69694: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854934.69725: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854934.69861: Evaluated conditional (not __network_is_ostree is defined): True 15406 1726854934.69873: variable 'omit' from source: magic vars 15406 1726854934.69922: variable 'omit' from source: magic vars 15406 1726854934.69960: variable 'omit' from source: magic vars 15406 1726854934.69993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854934.70026: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854934.70047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854934.70067: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854934.70083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854934.70120: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854934.70129: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854934.70137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854934.70240: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854934.70251: Set connection var ansible_timeout to 10 15406 1726854934.70311: Set connection var ansible_connection to ssh 15406 1726854934.70314: Set connection var ansible_shell_type to sh 15406 1726854934.70316: Set connection var ansible_shell_executable to /bin/sh 15406 1726854934.70318: Set connection var ansible_pipelining to False 15406 1726854934.70319: variable 'ansible_shell_executable' from source: unknown 15406 1726854934.70321: variable 
'ansible_connection' from source: unknown 15406 1726854934.70324: variable 'ansible_module_compression' from source: unknown 15406 1726854934.70330: variable 'ansible_shell_type' from source: unknown 15406 1726854934.70338: variable 'ansible_shell_executable' from source: unknown 15406 1726854934.70345: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854934.70353: variable 'ansible_pipelining' from source: unknown 15406 1726854934.70360: variable 'ansible_timeout' from source: unknown 15406 1726854934.70368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854934.70512: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854934.70532: variable 'omit' from source: magic vars 15406 1726854934.70592: starting attempt loop 15406 1726854934.70595: running the handler 15406 1726854934.70597: _low_level_execute_command(): starting 15406 1726854934.70600: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854934.71248: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854934.71306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854934.71378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854934.71411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854934.71542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854934.73766: stdout chunk (state=3): >>>/root <<< 15406 1726854934.73975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854934.73978: stdout chunk (state=3): >>><<< 15406 1726854934.73984: stderr chunk (state=3): >>><<< 15406 1726854934.74006: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854934.74111: _low_level_execute_command(): starting 15406 1726854934.74115: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854934.7401934-15566-22937808258986 `" && echo ansible-tmp-1726854934.7401934-15566-22937808258986="` echo /root/.ansible/tmp/ansible-tmp-1726854934.7401934-15566-22937808258986 `" ) && sleep 0' 15406 1726854934.74678: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854934.74698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854934.74714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854934.74828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854934.74858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854934.74972: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854934.77609: stdout chunk (state=3): >>>ansible-tmp-1726854934.7401934-15566-22937808258986=/root/.ansible/tmp/ansible-tmp-1726854934.7401934-15566-22937808258986 <<< 15406 1726854934.77821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854934.77850: stdout chunk (state=3): >>><<< 15406 1726854934.77853: stderr chunk (state=3): >>><<< 15406 1726854934.77993: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854934.7401934-15566-22937808258986=/root/.ansible/tmp/ansible-tmp-1726854934.7401934-15566-22937808258986 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854934.77996: variable 'ansible_module_compression' from source: unknown 15406 1726854934.77998: ANSIBALLZ: Using lock for stat 15406 1726854934.78000: ANSIBALLZ: 
Acquiring lock 15406 1726854934.78002: ANSIBALLZ: Lock acquired: 140626835987088 15406 1726854934.78004: ANSIBALLZ: Creating module 15406 1726854934.92823: ANSIBALLZ: Writing module into payload 15406 1726854934.92963: ANSIBALLZ: Writing module 15406 1726854934.92995: ANSIBALLZ: Renaming module 15406 1726854934.93147: ANSIBALLZ: Done creating module 15406 1726854934.93397: variable 'ansible_facts' from source: unknown 15406 1726854934.93401: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854934.7401934-15566-22937808258986/AnsiballZ_stat.py 15406 1726854934.93704: Sending initial data 15406 1726854934.93716: Sent initial data (152 bytes) 15406 1726854934.95232: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854934.95253: stderr chunk (state=3): >>>debug2: match found <<< 15406 1726854934.95346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854934.95574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854934.95668: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854934.97996: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854934.98076: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15406 1726854934.98196: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp5wcgi1z9 /root/.ansible/tmp/ansible-tmp-1726854934.7401934-15566-22937808258986/AnsiballZ_stat.py <<< 15406 1726854934.98200: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854934.7401934-15566-22937808258986/AnsiballZ_stat.py" <<< 15406 1726854934.98373: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp5wcgi1z9" to remote "/root/.ansible/tmp/ansible-tmp-1726854934.7401934-15566-22937808258986/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854934.7401934-15566-22937808258986/AnsiballZ_stat.py" <<< 15406 1726854934.99936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854934.99939: stderr chunk (state=3): >>><<< 15406 1726854934.99942: stdout chunk (state=3): >>><<< 
15406 1726854934.99944: done transferring module to remote 15406 1726854934.99946: _low_level_execute_command(): starting 15406 1726854934.99948: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854934.7401934-15566-22937808258986/ /root/.ansible/tmp/ansible-tmp-1726854934.7401934-15566-22937808258986/AnsiballZ_stat.py && sleep 0' 15406 1726854935.01248: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854935.01293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 15406 1726854935.01297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854935.01465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854935.01552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854935.04200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854935.04459: stderr chunk (state=3): >>><<< 15406 1726854935.04462: stdout chunk (state=3): >>><<< 15406 
1726854935.04467: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854935.04469: _low_level_execute_command(): starting 15406 1726854935.04472: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854934.7401934-15566-22937808258986/AnsiballZ_stat.py && sleep 0' 15406 1726854935.05584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854935.05703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854935.05708: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854935.05710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15406 1726854935.05713: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854935.05715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854935.05767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854935.05899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854935.08916: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15406 1726854935.09051: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 15406 1726854935.09143: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # <<< 15406 1726854935.09155: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 15406 1726854935.09293: stdout chunk (state=3): >>>import 'time' # <<< 15406 1726854935.09307: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 15406 1726854935.09331: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import 
'_codecs' # <<< 15406 1726854935.09450: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9188184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9187e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91881aa50> import '_signal' # <<< 15406 1726854935.09474: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 15406 1726854935.09498: stdout chunk (state=3): >>>import 'io' # <<< 15406 1726854935.09543: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 15406 1726854935.09666: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15406 1726854935.09699: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 15406 1726854935.09801: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 15406 1726854935.09814: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 15406 1726854935.09903: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 15406 1726854935.09909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc91862d130> <<< 15406 1726854935.10046: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91862dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15406 1726854935.10461: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15406 1726854935.10478: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854935.10498: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15406 1726854935.10716: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91866bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15406 1726854935.10742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc91866bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15406 1726854935.10768: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15406 1726854935.10828: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854935.10853: stdout chunk (state=3): >>>import 'itertools' # <<< 15406 1726854935.10886: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186a3830> <<< 15406 1726854935.10932: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 15406 1726854935.10955: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186a3ec0> import '_collections' # <<< 15406 1726854935.11019: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918683b60> <<< 15406 1726854935.11266: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918669070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import 
'_sre' # <<< 15406 1726854935.11299: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 15406 1726854935.11303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15406 1726854935.11332: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 15406 1726854935.11390: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186c37d0> <<< 15406 1726854935.11415: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186c23f0> <<< 15406 1726854935.11438: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918682150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186c0bc0> <<< 15406 1726854935.11524: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186f8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186682f0> <<< 15406 1726854935.11551: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 15406 1726854935.11600: stdout chunk (state=3): >>># 
extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854935.11815: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9186f8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186f8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9186f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918666e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186f9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186f9370> import 'importlib.machinery' # <<< 15406 1726854935.12016: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186fa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py 
# code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918710740> import 'errno' # <<< 15406 1726854935.12066: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc918711e20> <<< 15406 1726854935.12069: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 15406 1726854935.12091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 15406 1726854935.12117: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 15406 1726854935.12135: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918712cc0> <<< 15406 1726854935.12171: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9187132f0> <<< 15406 1726854935.12209: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918712210> <<< 15406 1726854935.12222: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15406 1726854935.12262: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854935.12366: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc918713d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9187134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186fa4b0> <<< 15406 1726854935.12369: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 15406 1726854935.12393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15406 1726854935.12416: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 15406 1726854935.12502: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9184d3c50> <<< 15406 1726854935.12517: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 15406 1726854935.12553: stdout chunk (state=3): >>># extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9184fc710> <<< 15406 1726854935.12696: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9184fc470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9184fc740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15406 1726854935.12712: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854935.12892: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9184fd070> <<< 15406 1726854935.13118: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9184fda60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9184fc920> <<< 15406 1726854935.13139: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc9184d1df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15406 1726854935.13522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9184fee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9184fdb50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186fac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9185271a0> <<< 15406 1726854935.13594: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 15406 1726854935.13617: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15406 1726854935.13649: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91854b560> <<< 15406 1726854935.13673: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches 
/usr/lib64/python3.12/pathlib.py <<< 15406 1726854935.13844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9185ac2c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15406 1726854935.13871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15406 1726854935.13901: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15406 1726854935.13952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15406 1726854935.14073: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9185aea20> <<< 15406 1726854935.14170: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9185ac3e0> <<< 15406 1726854935.14218: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91856d2b0> <<< 15406 1726854935.14254: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9183b53d0> <<< 15406 1726854935.14292: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91854a360> import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9184ffd70> <<< 15406 1726854935.14451: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15406 1726854935.14506: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc9183b5670> <<< 15406 1726854935.14665: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_9i_1z9i9/ansible_stat_payload.zip' # zipimport: zlib available<<< 15406 1726854935.14845: stdout chunk (state=3): >>> <<< 15406 1726854935.14891: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.14927: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 15406 1726854935.14945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 15406 1726854935.14990: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15406 1726854935.15094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15406 1726854935.15157: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91840b170> <<< 15406 1726854935.15176: stdout chunk (state=3): >>>import '_typing' # <<< 15406 1726854935.15420: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9183ea060> <<< 15406 1726854935.15443: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9183e91f0> <<< 15406 1726854935.15492: 
stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # <<< 15406 1726854935.15509: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854935.15604: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.15607: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 15406 1726854935.17989: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.19543: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 15406 1726854935.19661: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918409040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc918432ab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918432840> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918432150> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 15406 1726854935.19679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15406 1726854935.19722: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9184328a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91840bb90> <<< 15406 1726854935.19984: stdout chunk (state=3): >>>import 'atexit' # <<< 15406 1726854935.20000: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9184337d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9184339e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918433ef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15406 1726854935.20028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15406 1726854935.20080: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d11ca0> <<< 15406 1726854935.20109: stdout chunk (state=3): >>># extension module 
'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d138c0> <<< 15406 1726854935.20131: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 15406 1726854935.20160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15406 1726854935.20364: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d142c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d15190> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15406 1726854935.20439: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d17e90> <<< 15406 1726854935.20483: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d17dd0> <<< 15406 1726854935.20508: stdout chunk 
(state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d16150> <<< 15406 1726854935.20538: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15406 1726854935.20601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 15406 1726854935.20735: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15406 1726854935.20846: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d1fec0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d1e990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d1e6f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15406 1726854935.20941: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d1ec60> <<< 15406 1726854935.20977: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d16660> <<< 15406 1726854935.21210: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # 
extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d67ec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d681d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 15406 1726854935.21218: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d69c70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d69a30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15406 1726854935.21593: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fc917d6c200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d6a360> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15406 1726854935.21741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d6f9b0> <<< 15406 1726854935.21899: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d6c380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d70cb0> <<< 15406 1726854935.21930: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d70a10> <<< 15406 1726854935.21994: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7fc917d70c80> <<< 15406 1726854935.22023: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d68350> <<< 15406 1726854935.22032: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 15406 1726854935.22036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 15406 1726854935.22062: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15406 1726854935.22110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15406 1726854935.22117: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854935.22372: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917df81a0> <<< 15406 1726854935.22377: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15406 1726854935.22404: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917df9160> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d72930> <<< 15406 1726854935.22440: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' 
executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d73ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d72540> <<< 15406 1726854935.22483: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.22593: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 15406 1726854935.22639: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.22992: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 15406 1726854935.23034: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.23248: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.24176: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.24941: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 15406 1726854935.25056: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854935.25063: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917c01490> <<< 15406 1726854935.25317: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917c027e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917df9340> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854935.25323: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 15406 1726854935.25547: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.25627: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.25811: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 15406 1726854935.25824: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917c02960> <<< 15406 1726854935.25954: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.26548: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.27613: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 15406 1726854935.27706: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.27829: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15406 1726854935.27864: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.parsing' # <<< 15406 1726854935.27885: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.27930: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.28067: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 15406 1726854935.28070: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.28331: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.28706: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15406 1726854935.28794: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 15406 1726854935.29050: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917c03440> # zipimport: zlib available # zipimport: zlib available <<< 15406 1726854935.29103: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 15406 1726854935.29123: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 15406 1726854935.29198: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.29217: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.29390: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 15406 1726854935.29393: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854935.29403: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.29478: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.29573: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15406 1726854935.29626: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15406 1726854935.29739: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917c0df70> <<< 15406 1726854935.29797: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917c0b230> <<< 15406 1726854935.29926: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 15406 1726854935.29949: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15406 1726854935.30362: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' 
import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918486900> <<< 15406 1726854935.30409: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91846e5d0> <<< 15406 1726854935.30519: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917c0e120> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917c02e40> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 15406 1726854935.30538: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.30592: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.30615: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15406 1726854935.30726: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 15406 1726854935.30744: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 15406 1726854935.30809: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.31143: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.31250: stdout chunk (state=3): >>># zipimport: zlib available <<< 15406 1726854935.31460: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 15406 1726854935.31885: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback <<< 15406 1726854935.32014: stdout chunk (state=3): >>># clear 
sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat<<< 15406 1726854935.32132: stdout chunk (state=3): >>> # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre <<< 15406 1726854935.32275: stdout chunk (state=3): >>># cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # 
cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # 
cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy 
ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing 
ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 15406 1726854935.32584: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 15406 1726854935.32618: stdout chunk (state=3): >>># destroy ntpath <<< 15406 1726854935.32803: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 15406 1726854935.32807: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings <<< 15406 1726854935.32828: stdout chunk (state=3): >>># destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 15406 1726854935.33015: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 15406 1726854935.33033: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # 
cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy 
systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15406 1726854935.33176: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 15406 1726854935.33201: stdout chunk (state=3): >>># destroy _collections <<< 15406 1726854935.33248: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 15406 1726854935.33389: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 15406 1726854935.33392: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15406 1726854935.33563: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 15406 1726854935.33617: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 15406 1726854935.33692: stdout chunk (state=3): >>># clear sys.audit hooks <<< 15406 1726854935.34130: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854935.34144: stderr chunk (state=3): >>><<< 15406 1726854935.34391: stdout chunk (state=3): >>><<< 15406 1726854935.34407: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9188184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9187e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91881aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: 
'/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91862d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91862dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91866bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fc91866bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918683b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918669070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc9186c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186c23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918682150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186c0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186f8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9186f8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186f8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9186f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918666e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object 
from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186f9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186f9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186fa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918710740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc918711e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc918712cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9187132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918712210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc918713d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9187134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186fa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9184d3c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9184fc710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9184fc470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9184fc740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9184fd070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9184fda60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9184fc920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9184d1df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9184fee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9184fdb50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9186fac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9185271a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91854b560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9185ac2c0> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9185aea20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9185ac3e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91856d2b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9183b53d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91854a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9184ffd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc9183b5670> # zipimport: found 30 names in '/tmp/ansible_stat_payload_9i_1z9i9/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91840b170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9183ea060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9183e91f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918409040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc918432ab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918432840> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918432150> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc9184328a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91840bb90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9184337d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc9184339e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918433ef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d11ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d138c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d142c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d15190> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d17e90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d17dd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d16150> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d1fec0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d1e990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d1e6f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d1ec60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d16660> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d67ec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d681d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d69c70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d69a30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d6c200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d6a360> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d6f9b0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d6c380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d70cb0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d70a10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d70c80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d68350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917df81a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917df9160> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d72930> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917d73ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917d72540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917c01490> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917c027e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917df9340> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917c02960> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917c03440> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc917c0df70> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917c0b230> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc918486900> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc91846e5d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917c0e120> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc917c02e40> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # 
cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing 
urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy 
ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy 
subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
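The repeated `debug2` mux lines above ("auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0'", "mux_client_request_session: master session id: 2") come from OpenSSH connection multiplexing: Ansible keeps one master SSH connection open per host and runs each task over it instead of renegotiating a new session every time. A minimal sketch of the `ansible.cfg` settings behind this behavior — these mirror ansible-core's documented defaults, and the socket path is illustrative:

```ini
# ansible.cfg -- connection reuse that produces the auto-mux debug lines above
[ssh_connection]
# ControlMaster=auto reuses an existing master; ControlPersist keeps it
# alive for 60s after the last session so consecutive tasks share it
ssh_args = -C -o ControlMaster=auto -o ControlPersist=60s
# control sockets such as ~/.ansible/cp/a4678d9fc0 live under this directory
control_path_dir = ~/.ansible/cp
```

"Shared connection to 10.31.45.178 closed." marks the end of one such multiplexed session, not of the master connection itself.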
[WARNING]: Module invocation had junk after the JSON data (the interpreter cleanup trace already shown in the module stdout above) 15406 1726854935.35737: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854934.7401934-15566-22937808258986/', 
'_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854935.35741: _low_level_execute_command(): starting 15406 1726854935.35743: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854934.7401934-15566-22937808258986/ > /dev/null 2>&1 && sleep 0' 15406 1726854935.36289: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854935.36351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854935.36372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854935.36513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854935.38991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854935.39008: stdout chunk (state=3): >>><<< 15406 1726854935.39025: stderr chunk (state=3): >>><<< 15406 1726854935.39064: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854935.39103: handler run complete 15406 1726854935.39293: attempt loop complete, returning result 15406 1726854935.39297: _execute() done 15406 1726854935.39299: dumping result to json 15406 1726854935.39301: done dumping result, returning 15406 1726854935.39303: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [0affcc66-ac2b-3c83-32d3-000000000091] 15406 1726854935.39305: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000091 ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 15406 1726854935.39438: no more pending results, returning what we have 15406 1726854935.39442: results queue empty 15406 1726854935.39442: checking for any_errors_fatal 15406 1726854935.39449: done checking for any_errors_fatal 15406 1726854935.39449: checking for 
max_fail_percentage 15406 1726854935.39451: done checking for max_fail_percentage 15406 1726854935.39452: checking to see if all hosts have failed and the running result is not ok 15406 1726854935.39453: done checking to see if all hosts have failed 15406 1726854935.39453: getting the remaining hosts for this loop 15406 1726854935.39456: done getting the remaining hosts for this loop 15406 1726854935.39460: getting the next task for host managed_node2 15406 1726854935.39468: done getting next task for host managed_node2 15406 1726854935.39471: ^ task is: TASK: Set flag to indicate system is ostree 15406 1726854935.39473: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854935.39477: getting variables 15406 1726854935.39479: in VariableManager get_vars() 15406 1726854935.39513: Calling all_inventory to load vars for managed_node2 15406 1726854935.39516: Calling groups_inventory to load vars for managed_node2 15406 1726854935.39520: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854935.39532: Calling all_plugins_play to load vars for managed_node2 15406 1726854935.39535: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854935.39538: Calling groups_plugins_play to load vars for managed_node2 15406 1726854935.40012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854935.40605: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000091 15406 1726854935.40608: WORKER PROCESS EXITING 15406 1726854935.40669: done with get_vars() 15406 1726854935.40680: done getting variables 15406 1726854935.40896: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 13:55:35 -0400 (0:00:00.743) 0:00:03.233 ****** 15406 1726854935.41081: entering _queue_task() for managed_node2/set_fact 15406 1726854935.41083: Creating lock for set_fact 15406 1726854935.42016: worker is 1 (out of 1 available) 15406 1726854935.42028: exiting _queue_task() for managed_node2/set_fact 15406 1726854935.42039: done queuing things up, now waiting for results queue to drain 15406 1726854935.42041: waiting for pending results... 
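The two tasks traced here ("Check if system is ostree" and "Set flag to indicate system is ostree") follow a common linux_system_roles pattern: `stat` the `/run/ostree-booted` marker once, then cache the result in a fact so later runs skip the check. A sketch reconstructed from the variable names and the conditional visible in this log (`__ostree_booted_stat`, `not __network_is_ostree is defined`) — not the verbatim contents of `el_repo_setup.yml`:

```yaml
# Reconstructed sketch: the stat result feeds a cached boolean fact
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

This matches the results in the log: `stat` returned `"exists": false`, so `set_fact` records `"__network_is_ostree": false` and the conditional guards both tasks on subsequent plays.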
15406 1726854935.42626: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 15406 1726854935.42797: in run() - task 0affcc66-ac2b-3c83-32d3-000000000092 15406 1726854935.42811: variable 'ansible_search_path' from source: unknown 15406 1726854935.42814: variable 'ansible_search_path' from source: unknown 15406 1726854935.43176: calling self._execute() 15406 1726854935.43258: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854935.43262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854935.43394: variable 'omit' from source: magic vars 15406 1726854935.44632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854935.45668: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854935.45864: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854935.45902: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854935.46292: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854935.46297: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854935.46299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854935.46302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854935.46304: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854935.46611: Evaluated conditional (not __network_is_ostree is defined): True 15406 1726854935.46623: variable 'omit' from source: magic vars 15406 1726854935.46666: variable 'omit' from source: magic vars 15406 1726854935.47090: variable '__ostree_booted_stat' from source: set_fact 15406 1726854935.47234: variable 'omit' from source: magic vars 15406 1726854935.47319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854935.47352: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854935.47589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854935.47593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854935.47596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854935.47598: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854935.47600: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854935.47603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854935.47771: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854935.47812: Set connection var ansible_timeout to 10 15406 1726854935.47992: Set connection var ansible_connection to ssh 15406 1726854935.47995: Set connection var ansible_shell_type to sh 15406 1726854935.47998: Set connection var ansible_shell_executable to /bin/sh 15406 1726854935.48000: Set connection var ansible_pipelining to False 15406 1726854935.48002: variable 'ansible_shell_executable' 
from source: unknown 15406 1726854935.48005: variable 'ansible_connection' from source: unknown 15406 1726854935.48007: variable 'ansible_module_compression' from source: unknown 15406 1726854935.48009: variable 'ansible_shell_type' from source: unknown 15406 1726854935.48011: variable 'ansible_shell_executable' from source: unknown 15406 1726854935.48015: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854935.48017: variable 'ansible_pipelining' from source: unknown 15406 1726854935.48019: variable 'ansible_timeout' from source: unknown 15406 1726854935.48022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854935.48295: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854935.48299: variable 'omit' from source: magic vars 15406 1726854935.48301: starting attempt loop 15406 1726854935.48304: running the handler 15406 1726854935.48306: handler run complete 15406 1726854935.48308: attempt loop complete, returning result 15406 1726854935.48310: _execute() done 15406 1726854935.48312: dumping result to json 15406 1726854935.48314: done dumping result, returning 15406 1726854935.48316: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0affcc66-ac2b-3c83-32d3-000000000092] 15406 1726854935.48318: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000092 ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 15406 1726854935.48546: no more pending results, returning what we have 15406 1726854935.48549: results queue empty 15406 1726854935.48550: checking for any_errors_fatal 15406 1726854935.48555: done checking for any_errors_fatal 15406 
1726854935.48556: checking for max_fail_percentage 15406 1726854935.48558: done checking for max_fail_percentage 15406 1726854935.48559: checking to see if all hosts have failed and the running result is not ok 15406 1726854935.48560: done checking to see if all hosts have failed 15406 1726854935.48561: getting the remaining hosts for this loop 15406 1726854935.48562: done getting the remaining hosts for this loop 15406 1726854935.48566: getting the next task for host managed_node2 15406 1726854935.48574: done getting next task for host managed_node2 15406 1726854935.48577: ^ task is: TASK: Fix CentOS6 Base repo 15406 1726854935.48579: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854935.48583: getting variables 15406 1726854935.48585: in VariableManager get_vars() 15406 1726854935.48619: Calling all_inventory to load vars for managed_node2 15406 1726854935.48621: Calling groups_inventory to load vars for managed_node2 15406 1726854935.48625: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854935.48635: Calling all_plugins_play to load vars for managed_node2 15406 1726854935.48638: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854935.48640: Calling groups_plugins_play to load vars for managed_node2 15406 1726854935.49238: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000092 15406 1726854935.49248: WORKER PROCESS EXITING 15406 1726854935.49550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854935.50094: done with get_vars() 15406 1726854935.50217: done getting variables 15406 1726854935.50553: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 13:55:35 -0400 (0:00:00.094) 0:00:03.328 ****** 15406 1726854935.50583: entering _queue_task() for managed_node2/copy 15406 1726854935.51158: worker is 1 (out of 1 available) 15406 1726854935.51168: exiting _queue_task() for managed_node2/copy 15406 1726854935.51180: done queuing things up, now waiting for results queue to drain 15406 1726854935.51181: waiting for pending results... 
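[Editor's note] The trace above shows TaskExecutor evaluating the conditional `not __network_is_ostree is defined`, reading the `__ostree_booted_stat` fact, and returning `__network_is_ostree: false`. A plausible reconstruction of the task at `el_repo_setup.yml:22` follows; the exact source is not shown in this log, so the field values are assumptions:

```yaml
# Hypothetical reconstruction of the "Set flag to indicate system is ostree"
# task traced above. __ostree_booted_stat is assumed to come from an earlier
# stat task (typically on /run/ostree-booted); the log confirms only that the
# fact exists and that the result here was false.
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```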
15406 1726854935.51638: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 15406 1726854935.51842: in run() - task 0affcc66-ac2b-3c83-32d3-000000000094 15406 1726854935.51854: variable 'ansible_search_path' from source: unknown 15406 1726854935.51857: variable 'ansible_search_path' from source: unknown 15406 1726854935.51994: calling self._execute() 15406 1726854935.52065: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854935.52147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854935.52162: variable 'omit' from source: magic vars 15406 1726854935.53092: variable 'ansible_distribution' from source: facts 15406 1726854935.53216: Evaluated conditional (ansible_distribution == 'CentOS'): True 15406 1726854935.53492: variable 'ansible_distribution_major_version' from source: facts 15406 1726854935.53496: Evaluated conditional (ansible_distribution_major_version == '6'): False 15406 1726854935.53499: when evaluation is False, skipping this task 15406 1726854935.53501: _execute() done 15406 1726854935.53504: dumping result to json 15406 1726854935.53506: done dumping result, returning 15406 1726854935.53510: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0affcc66-ac2b-3c83-32d3-000000000094] 15406 1726854935.53512: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000094 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 15406 1726854935.53720: no more pending results, returning what we have 15406 1726854935.53723: results queue empty 15406 1726854935.53724: checking for any_errors_fatal 15406 1726854935.53729: done checking for any_errors_fatal 15406 1726854935.53730: checking for max_fail_percentage 15406 1726854935.53731: done checking for max_fail_percentage 15406 1726854935.53732: checking to see if all hosts have failed and the 
running result is not ok 15406 1726854935.53733: done checking to see if all hosts have failed 15406 1726854935.53734: getting the remaining hosts for this loop 15406 1726854935.53735: done getting the remaining hosts for this loop 15406 1726854935.53740: getting the next task for host managed_node2 15406 1726854935.53746: done getting next task for host managed_node2 15406 1726854935.53749: ^ task is: TASK: Include the task 'enable_epel.yml' 15406 1726854935.53752: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854935.53755: getting variables 15406 1726854935.53757: in VariableManager get_vars() 15406 1726854935.53785: Calling all_inventory to load vars for managed_node2 15406 1726854935.53790: Calling groups_inventory to load vars for managed_node2 15406 1726854935.53794: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854935.53808: Calling all_plugins_play to load vars for managed_node2 15406 1726854935.53811: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854935.53814: Calling groups_plugins_play to load vars for managed_node2 15406 1726854935.54295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854935.54904: done with get_vars() 15406 1726854935.54919: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000094 15406 1726854935.54922: WORKER PROCESS EXITING 15406 1726854935.54927: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 13:55:35 -0400 (0:00:00.045) 0:00:03.374 ****** 15406 1726854935.55131: entering _queue_task() for managed_node2/include_tasks 15406 1726854935.55738: worker is 1 (out of 1 available) 15406 1726854935.55750: exiting _queue_task() for managed_node2/include_tasks 15406 1726854935.55761: done queuing things up, now waiting for results queue to drain 15406 1726854935.55763: waiting for pending results... 
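[Editor's note] The skip above illustrates how a list-form `when` is evaluated clause by clause: the log records `ansible_distribution == 'CentOS'` as True, then `ansible_distribution_major_version == '6'` as False, and the task is skipped with the failing clause reported as `false_condition`. A sketch of the likely task shape (the repo content itself is not recoverable from the log):

```yaml
# Hypothetical shape of the skipped "Fix CentOS6 Base repo" task; each entry
# in the `when` list produces its own "Evaluated conditional" log line, and
# evaluation stops at the first False.
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo
    content: "..."  # repo definition elided; not recoverable from this log
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```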
15406 1726854935.56147: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 15406 1726854935.56482: in run() - task 0affcc66-ac2b-3c83-32d3-000000000095 15406 1726854935.56486: variable 'ansible_search_path' from source: unknown 15406 1726854935.56490: variable 'ansible_search_path' from source: unknown 15406 1726854935.56507: calling self._execute() 15406 1726854935.56584: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854935.56811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854935.56815: variable 'omit' from source: magic vars 15406 1726854935.57854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854935.60384: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854935.60490: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854935.60544: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854935.60597: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854935.60636: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854935.60736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854935.60762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854935.60842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854935.60846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854935.60868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854935.60998: variable '__network_is_ostree' from source: set_fact 15406 1726854935.61019: Evaluated conditional (not __network_is_ostree | d(false)): True 15406 1726854935.61029: _execute() done 15406 1726854935.61036: dumping result to json 15406 1726854935.61042: done dumping result, returning 15406 1726854935.61061: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0affcc66-ac2b-3c83-32d3-000000000095] 15406 1726854935.61170: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000095 15406 1726854935.61239: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000095 15406 1726854935.61242: WORKER PROCESS EXITING 15406 1726854935.61299: no more pending results, returning what we have 15406 1726854935.61305: in VariableManager get_vars() 15406 1726854935.61340: Calling all_inventory to load vars for managed_node2 15406 1726854935.61342: Calling groups_inventory to load vars for managed_node2 15406 1726854935.61346: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854935.61357: Calling all_plugins_play to load vars for managed_node2 15406 1726854935.61360: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854935.61363: Calling groups_plugins_play to load vars for managed_node2 15406 1726854935.62183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 15406 1726854935.62946: done with get_vars() 15406 1726854935.62954: variable 'ansible_search_path' from source: unknown 15406 1726854935.62955: variable 'ansible_search_path' from source: unknown 15406 1726854935.62994: we have included files to process 15406 1726854935.62995: generating all_blocks data 15406 1726854935.62997: done generating all_blocks data 15406 1726854935.63002: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15406 1726854935.63003: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15406 1726854935.63006: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15406 1726854935.64419: done processing included file 15406 1726854935.64422: iterating over new_blocks loaded from include file 15406 1726854935.64424: in VariableManager get_vars() 15406 1726854935.64552: done with get_vars() 15406 1726854935.64554: filtering new block on tags 15406 1726854935.64578: done filtering new block on tags 15406 1726854935.64581: in VariableManager get_vars() 15406 1726854935.64656: done with get_vars() 15406 1726854935.64658: filtering new block on tags 15406 1726854935.64671: done filtering new block on tags 15406 1726854935.64674: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 15406 1726854935.64679: extending task lists for all hosts with included blocks 15406 1726854935.64902: done extending task lists 15406 1726854935.64903: done processing included files 15406 1726854935.64904: results queue empty 15406 1726854935.64905: checking for any_errors_fatal 15406 1726854935.64908: done checking for any_errors_fatal 15406 1726854935.64909: checking for max_fail_percentage 15406 1726854935.64910: done 
checking for max_fail_percentage 15406 1726854935.64911: checking to see if all hosts have failed and the running result is not ok 15406 1726854935.64911: done checking to see if all hosts have failed 15406 1726854935.64912: getting the remaining hosts for this loop 15406 1726854935.64913: done getting the remaining hosts for this loop 15406 1726854935.64916: getting the next task for host managed_node2 15406 1726854935.64920: done getting next task for host managed_node2 15406 1726854935.64922: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 15406 1726854935.64924: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854935.64926: getting variables 15406 1726854935.64927: in VariableManager get_vars() 15406 1726854935.64935: Calling all_inventory to load vars for managed_node2 15406 1726854935.64938: Calling groups_inventory to load vars for managed_node2 15406 1726854935.64940: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854935.64947: Calling all_plugins_play to load vars for managed_node2 15406 1726854935.64955: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854935.64958: Calling groups_plugins_play to load vars for managed_node2 15406 1726854935.65346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854935.65752: done with get_vars() 15406 1726854935.65761: done getting variables 15406 1726854935.65882: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 15406 1726854935.66310: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 13:55:35 -0400 (0:00:00.112) 0:00:03.486 ****** 15406 1726854935.66355: entering _queue_task() for managed_node2/command 15406 1726854935.66357: Creating lock for command 15406 1726854935.67097: worker is 1 (out of 1 available) 15406 1726854935.67112: exiting _queue_task() for managed_node2/command 15406 1726854935.67124: done queuing things up, now waiting for results queue to drain 15406 1726854935.67126: waiting for pending results... 
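[Editor's note] The include above runs in two phases visible in the trace: first the `when` guard is evaluated (`not __network_is_ostree | d(false)` → True, using the fact set earlier), then the strategy loads `enable_epel.yml`, generates its blocks, filters them on tags, and extends the host's task list. A minimal sketch of the including task, under the assumption that the path is relative to the test tasks directory:

```yaml
# Hypothetical reconstruction of the include at el_repo_setup.yml:51.
# The d(false) filter makes the guard safe even if the ostree fact was
# never set (e.g. if the earlier set_fact task was skipped).
- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)
```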
15406 1726854935.67699: running TaskExecutor() for managed_node2/TASK: Create EPEL 10 15406 1726854935.67893: in run() - task 0affcc66-ac2b-3c83-32d3-0000000000af 15406 1726854935.68012: variable 'ansible_search_path' from source: unknown 15406 1726854935.68016: variable 'ansible_search_path' from source: unknown 15406 1726854935.68018: calling self._execute() 15406 1726854935.68139: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854935.68150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854935.68164: variable 'omit' from source: magic vars 15406 1726854935.68967: variable 'ansible_distribution' from source: facts 15406 1726854935.68990: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15406 1726854935.69282: variable 'ansible_distribution_major_version' from source: facts 15406 1726854935.69309: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15406 1726854935.69312: when evaluation is False, skipping this task 15406 1726854935.69315: _execute() done 15406 1726854935.69318: dumping result to json 15406 1726854935.69418: done dumping result, returning 15406 1726854935.69421: done running TaskExecutor() for managed_node2/TASK: Create EPEL 10 [0affcc66-ac2b-3c83-32d3-0000000000af] 15406 1726854935.69424: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000af 15406 1726854935.69722: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000af 15406 1726854935.69725: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15406 1726854935.69779: no more pending results, returning what we have 15406 1726854935.69783: results queue empty 15406 1726854935.69784: checking for any_errors_fatal 15406 1726854935.69785: done checking for any_errors_fatal 15406 1726854935.69786: checking for 
max_fail_percentage 15406 1726854935.69789: done checking for max_fail_percentage 15406 1726854935.69791: checking to see if all hosts have failed and the running result is not ok 15406 1726854935.69791: done checking to see if all hosts have failed 15406 1726854935.69792: getting the remaining hosts for this loop 15406 1726854935.69793: done getting the remaining hosts for this loop 15406 1726854935.69797: getting the next task for host managed_node2 15406 1726854935.69804: done getting next task for host managed_node2 15406 1726854935.69807: ^ task is: TASK: Install yum-utils package 15406 1726854935.69810: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854935.69815: getting variables 15406 1726854935.69816: in VariableManager get_vars() 15406 1726854935.69847: Calling all_inventory to load vars for managed_node2 15406 1726854935.69850: Calling groups_inventory to load vars for managed_node2 15406 1726854935.69854: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854935.69868: Calling all_plugins_play to load vars for managed_node2 15406 1726854935.69872: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854935.69875: Calling groups_plugins_play to load vars for managed_node2 15406 1726854935.70364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854935.70563: done with get_vars() 15406 1726854935.70574: done getting variables 15406 1726854935.70702: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 13:55:35 -0400 (0:00:00.043) 0:00:03.530 ****** 15406 1726854935.70733: entering _queue_task() for managed_node2/package 15406 1726854935.70735: Creating lock for package 15406 1726854935.71081: worker is 1 (out of 1 available) 15406 1726854935.71208: exiting _queue_task() for managed_node2/package 15406 1726854935.71223: done queuing things up, now waiting for results queue to drain 15406 1726854935.71224: waiting for pending results... 
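[Editor's note] Note the two forms of the same task name in the trace: the state dump shows `TASK: Create EPEL {{ ansible_distribution_major_version }}` (unrendered) while the banner shows `TASK [Create EPEL 10]`, because the name template is rendered against the host's facts only at display time. A hedged sketch of the skipped task's likely shape:

```yaml
# Hypothetical shape of the "Create EPEL" task at enable_epel.yml:8; the
# templated name renders to "Create EPEL 10" for this host, and the task is
# skipped because the major version is not in ['7', '8'].
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: "..."  # actual command elided; not recoverable from this log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```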
15406 1726854935.71439: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 15406 1726854935.71540: in run() - task 0affcc66-ac2b-3c83-32d3-0000000000b0 15406 1726854935.71563: variable 'ansible_search_path' from source: unknown 15406 1726854935.71644: variable 'ansible_search_path' from source: unknown 15406 1726854935.71648: calling self._execute() 15406 1726854935.71724: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854935.71735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854935.71756: variable 'omit' from source: magic vars 15406 1726854935.72168: variable 'ansible_distribution' from source: facts 15406 1726854935.72204: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15406 1726854935.72353: variable 'ansible_distribution_major_version' from source: facts 15406 1726854935.72410: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15406 1726854935.72413: when evaluation is False, skipping this task 15406 1726854935.72416: _execute() done 15406 1726854935.72418: dumping result to json 15406 1726854935.72421: done dumping result, returning 15406 1726854935.72423: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0affcc66-ac2b-3c83-32d3-0000000000b0] 15406 1726854935.72427: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000b0 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15406 1726854935.72697: no more pending results, returning what we have 15406 1726854935.72701: results queue empty 15406 1726854935.72702: checking for any_errors_fatal 15406 1726854935.72711: done checking for any_errors_fatal 15406 1726854935.72712: checking for max_fail_percentage 15406 1726854935.72715: done checking for max_fail_percentage 15406 1726854935.72716: checking to see if 
all hosts have failed and the running result is not ok 15406 1726854935.72717: done checking to see if all hosts have failed 15406 1726854935.72718: getting the remaining hosts for this loop 15406 1726854935.72719: done getting the remaining hosts for this loop 15406 1726854935.72731: getting the next task for host managed_node2 15406 1726854935.72738: done getting next task for host managed_node2 15406 1726854935.72741: ^ task is: TASK: Enable EPEL 7 15406 1726854935.72744: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854935.72750: getting variables 15406 1726854935.72752: in VariableManager get_vars() 15406 1726854935.72782: Calling all_inventory to load vars for managed_node2 15406 1726854935.72785: Calling groups_inventory to load vars for managed_node2 15406 1726854935.72791: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854935.72806: Calling all_plugins_play to load vars for managed_node2 15406 1726854935.72809: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854935.72813: Calling groups_plugins_play to load vars for managed_node2 15406 1726854935.73174: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000b0 15406 1726854935.73177: WORKER PROCESS EXITING 15406 1726854935.73208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854935.73436: done with get_vars() 15406 1726854935.73445: done getting variables 15406 1726854935.73522: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 13:55:35 -0400 (0:00:00.028) 0:00:03.558 ****** 15406 1726854935.73560: entering _queue_task() for managed_node2/command 15406 1726854935.73866: worker is 1 (out of 1 available) 15406 1726854935.73879: exiting _queue_task() for managed_node2/command 15406 1726854935.73894: done queuing things up, now waiting for results queue to drain 15406 1726854935.73896: waiting for pending results... 
15406 1726854935.74132: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 15406 1726854935.74282: in run() - task 0affcc66-ac2b-3c83-32d3-0000000000b1 15406 1726854935.74305: variable 'ansible_search_path' from source: unknown 15406 1726854935.74318: variable 'ansible_search_path' from source: unknown 15406 1726854935.74370: calling self._execute() 15406 1726854935.74450: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854935.74461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854935.74489: variable 'omit' from source: magic vars 15406 1726854935.74990: variable 'ansible_distribution' from source: facts 15406 1726854935.75026: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15406 1726854935.75206: variable 'ansible_distribution_major_version' from source: facts 15406 1726854935.75236: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15406 1726854935.75240: when evaluation is False, skipping this task 15406 1726854935.75242: _execute() done 15406 1726854935.75244: dumping result to json 15406 1726854935.75263: done dumping result, returning 15406 1726854935.75267: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0affcc66-ac2b-3c83-32d3-0000000000b1] 15406 1726854935.75293: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000b1 15406 1726854935.75424: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000b1 15406 1726854935.75427: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15406 1726854935.75505: no more pending results, returning what we have 15406 1726854935.75509: results queue empty 15406 1726854935.75510: checking for any_errors_fatal 15406 1726854935.75515: done checking for any_errors_fatal 15406 1726854935.75515: checking for 
max_fail_percentage 15406 1726854935.75518: done checking for max_fail_percentage 15406 1726854935.75519: checking to see if all hosts have failed and the running result is not ok 15406 1726854935.75520: done checking to see if all hosts have failed 15406 1726854935.75520: getting the remaining hosts for this loop 15406 1726854935.75522: done getting the remaining hosts for this loop 15406 1726854935.75525: getting the next task for host managed_node2 15406 1726854935.75533: done getting next task for host managed_node2 15406 1726854935.75535: ^ task is: TASK: Enable EPEL 8 15406 1726854935.75539: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854935.75544: getting variables 15406 1726854935.75545: in VariableManager get_vars() 15406 1726854935.75577: Calling all_inventory to load vars for managed_node2 15406 1726854935.75580: Calling groups_inventory to load vars for managed_node2 15406 1726854935.75585: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854935.75603: Calling all_plugins_play to load vars for managed_node2 15406 1726854935.75607: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854935.75611: Calling groups_plugins_play to load vars for managed_node2 15406 1726854935.76019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854935.76216: done with get_vars() 15406 1726854935.76229: done getting variables 15406 1726854935.76304: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 13:55:35 -0400 (0:00:00.027) 0:00:03.586 ****** 15406 1726854935.76334: entering _queue_task() for managed_node2/command 15406 1726854935.76803: worker is 1 (out of 1 available) 15406 1726854935.76905: exiting _queue_task() for managed_node2/command 15406 1726854935.76992: done queuing things up, now waiting for results queue to drain 15406 1726854935.76994: waiting for pending results... 
15406 1726854935.77461: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 15406 1726854935.77509: in run() - task 0affcc66-ac2b-3c83-32d3-0000000000b2 15406 1726854935.77577: variable 'ansible_search_path' from source: unknown 15406 1726854935.77592: variable 'ansible_search_path' from source: unknown 15406 1726854935.77785: calling self._execute() 15406 1726854935.77849: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854935.78093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854935.78097: variable 'omit' from source: magic vars 15406 1726854935.78617: variable 'ansible_distribution' from source: facts 15406 1726854935.78635: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15406 1726854935.78972: variable 'ansible_distribution_major_version' from source: facts 15406 1726854935.79091: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15406 1726854935.79227: when evaluation is False, skipping this task 15406 1726854935.79235: _execute() done 15406 1726854935.79238: dumping result to json 15406 1726854935.79243: done dumping result, returning 15406 1726854935.79246: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0affcc66-ac2b-3c83-32d3-0000000000b2] 15406 1726854935.79249: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000b2 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15406 1726854935.79814: no more pending results, returning what we have 15406 1726854935.79817: results queue empty 15406 1726854935.79818: checking for any_errors_fatal 15406 1726854935.79823: done checking for any_errors_fatal 15406 1726854935.79824: checking for max_fail_percentage 15406 1726854935.79825: done checking for max_fail_percentage 15406 1726854935.79827: checking to see if all hosts have failed and 
the running result is not ok 15406 1726854935.79829: done checking to see if all hosts have failed 15406 1726854935.79830: getting the remaining hosts for this loop 15406 1726854935.79831: done getting the remaining hosts for this loop 15406 1726854935.79835: getting the next task for host managed_node2 15406 1726854935.79845: done getting next task for host managed_node2 15406 1726854935.79847: ^ task is: TASK: Enable EPEL 6 15406 1726854935.79851: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854935.79857: getting variables 15406 1726854935.79859: in VariableManager get_vars() 15406 1726854935.80058: Calling all_inventory to load vars for managed_node2 15406 1726854935.80061: Calling groups_inventory to load vars for managed_node2 15406 1726854935.80064: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854935.80326: Calling all_plugins_play to load vars for managed_node2 15406 1726854935.80330: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854935.80467: Calling groups_plugins_play to load vars for managed_node2 15406 1726854935.81200: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000b2 15406 1726854935.81203: WORKER PROCESS EXITING 15406 1726854935.81224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854935.81811: done with get_vars() 15406 1726854935.81822: done getting variables 15406 1726854935.81874: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 13:55:35 -0400 (0:00:00.055) 0:00:03.641 ****** 15406 1726854935.81905: entering _queue_task() for managed_node2/copy 15406 1726854935.82463: worker is 1 (out of 1 available) 15406 1726854935.82475: exiting _queue_task() for managed_node2/copy 15406 1726854935.82492: done queuing things up, now waiting for results queue to drain 15406 1726854935.82494: waiting for pending results... 
15406 1726854935.82678: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 15406 1726854935.83131: in run() - task 0affcc66-ac2b-3c83-32d3-0000000000b4 15406 1726854935.83135: variable 'ansible_search_path' from source: unknown 15406 1726854935.83138: variable 'ansible_search_path' from source: unknown 15406 1726854935.83140: calling self._execute() 15406 1726854935.83268: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854935.83279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854935.83298: variable 'omit' from source: magic vars 15406 1726854935.83886: variable 'ansible_distribution' from source: facts 15406 1726854935.83919: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15406 1726854935.84033: variable 'ansible_distribution_major_version' from source: facts 15406 1726854935.84102: Evaluated conditional (ansible_distribution_major_version == '6'): False 15406 1726854935.84105: when evaluation is False, skipping this task 15406 1726854935.84108: _execute() done 15406 1726854935.84110: dumping result to json 15406 1726854935.84113: done dumping result, returning 15406 1726854935.84115: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0affcc66-ac2b-3c83-32d3-0000000000b4] 15406 1726854935.84117: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000b4 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 15406 1726854935.84246: no more pending results, returning what we have 15406 1726854935.84250: results queue empty 15406 1726854935.84251: checking for any_errors_fatal 15406 1726854935.84257: done checking for any_errors_fatal 15406 1726854935.84259: checking for max_fail_percentage 15406 1726854935.84261: done checking for max_fail_percentage 15406 1726854935.84262: checking to see if all hosts have failed and the running 
result is not ok 15406 1726854935.84263: done checking to see if all hosts have failed 15406 1726854935.84263: getting the remaining hosts for this loop 15406 1726854935.84264: done getting the remaining hosts for this loop 15406 1726854935.84269: getting the next task for host managed_node2 15406 1726854935.84283: done getting next task for host managed_node2 15406 1726854935.84286: ^ task is: TASK: Set network provider to 'nm' 15406 1726854935.84290: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854935.84295: getting variables 15406 1726854935.84296: in VariableManager get_vars() 15406 1726854935.84334: Calling all_inventory to load vars for managed_node2 15406 1726854935.84337: Calling groups_inventory to load vars for managed_node2 15406 1726854935.84341: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854935.84355: Calling all_plugins_play to load vars for managed_node2 15406 1726854935.84358: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854935.84361: Calling groups_plugins_play to load vars for managed_node2 15406 1726854935.85025: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000b4 15406 1726854935.85029: WORKER PROCESS EXITING 15406 1726854935.85408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854935.85994: done with get_vars() 15406 1726854935.86100: done getting variables 15406 1726854935.86454: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:13 Friday 20 September 2024 13:55:35 -0400 (0:00:00.045) 0:00:03.687 ****** 15406 1726854935.86523: entering _queue_task() for managed_node2/set_fact 15406 1726854935.87440: worker is 1 (out of 1 available) 15406 1726854935.87453: exiting _queue_task() for managed_node2/set_fact 15406 1726854935.87753: done queuing things up, now waiting for results queue to drain 15406 1726854935.87755: waiting for pending results... 15406 1726854935.88160: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 15406 1726854935.88542: in run() - task 0affcc66-ac2b-3c83-32d3-000000000007 15406 1726854935.88547: variable 'ansible_search_path' from source: unknown 15406 1726854935.88573: calling self._execute() 15406 1726854935.88766: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854935.88795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854935.88860: variable 'omit' from source: magic vars 15406 1726854935.89313: variable 'omit' from source: magic vars 15406 1726854935.89317: variable 'omit' from source: magic vars 15406 1726854935.89615: variable 'omit' from source: magic vars 15406 1726854935.89773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854935.89829: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854935.90208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854935.90212: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854935.90214: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854935.90216: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854935.90219: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854935.90221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854935.90223: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854935.90225: Set connection var ansible_timeout to 10 15406 1726854935.90227: Set connection var ansible_connection to ssh 15406 1726854935.90229: Set connection var ansible_shell_type to sh 15406 1726854935.90765: Set connection var ansible_shell_executable to /bin/sh 15406 1726854935.91044: Set connection var ansible_pipelining to False 15406 1726854935.91048: variable 'ansible_shell_executable' from source: unknown 15406 1726854935.91050: variable 'ansible_connection' from source: unknown 15406 1726854935.91052: variable 'ansible_module_compression' from source: unknown 15406 1726854935.91054: variable 'ansible_shell_type' from source: unknown 15406 1726854935.91058: variable 'ansible_shell_executable' from source: unknown 15406 1726854935.91061: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854935.91063: variable 'ansible_pipelining' from source: unknown 15406 1726854935.91065: variable 'ansible_timeout' from source: unknown 15406 1726854935.91067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854935.91846: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854935.91890: variable 'omit' from source: magic vars 15406 1726854935.91931: starting 
attempt loop 15406 1726854935.92003: running the handler 15406 1726854935.92022: handler run complete 15406 1726854935.92168: attempt loop complete, returning result 15406 1726854935.92171: _execute() done 15406 1726854935.92174: dumping result to json 15406 1726854935.92177: done dumping result, returning 15406 1726854935.92179: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0affcc66-ac2b-3c83-32d3-000000000007] 15406 1726854935.92181: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000007 ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 15406 1726854935.92889: no more pending results, returning what we have 15406 1726854935.92892: results queue empty 15406 1726854935.92893: checking for any_errors_fatal 15406 1726854935.92899: done checking for any_errors_fatal 15406 1726854935.92900: checking for max_fail_percentage 15406 1726854935.92902: done checking for max_fail_percentage 15406 1726854935.92903: checking to see if all hosts have failed and the running result is not ok 15406 1726854935.92904: done checking to see if all hosts have failed 15406 1726854935.92910: getting the remaining hosts for this loop 15406 1726854935.92912: done getting the remaining hosts for this loop 15406 1726854935.92916: getting the next task for host managed_node2 15406 1726854935.93125: done getting next task for host managed_node2 15406 1726854935.93128: ^ task is: TASK: meta (flush_handlers) 15406 1726854935.93131: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854935.93135: getting variables 15406 1726854935.93137: in VariableManager get_vars() 15406 1726854935.93254: Calling all_inventory to load vars for managed_node2 15406 1726854935.93257: Calling groups_inventory to load vars for managed_node2 15406 1726854935.93261: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854935.93271: Calling all_plugins_play to load vars for managed_node2 15406 1726854935.93274: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854935.93277: Calling groups_plugins_play to load vars for managed_node2 15406 1726854935.93726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854935.94609: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000007 15406 1726854935.94613: WORKER PROCESS EXITING 15406 1726854935.94638: done with get_vars() 15406 1726854935.94647: done getting variables 15406 1726854935.94721: in VariableManager get_vars() 15406 1726854935.94730: Calling all_inventory to load vars for managed_node2 15406 1726854935.94732: Calling groups_inventory to load vars for managed_node2 15406 1726854935.94734: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854935.94739: Calling all_plugins_play to load vars for managed_node2 15406 1726854935.94742: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854935.94745: Calling groups_plugins_play to load vars for managed_node2 15406 1726854935.95060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854935.95367: done with get_vars() 15406 1726854935.95383: done queuing things up, now waiting for results queue to drain 15406 1726854935.95385: results queue empty 15406 1726854935.95386: checking for any_errors_fatal 15406 1726854935.95534: done checking for any_errors_fatal 15406 1726854935.95535: checking for max_fail_percentage 15406 
1726854935.95537: done checking for max_fail_percentage 15406 1726854935.95537: checking to see if all hosts have failed and the running result is not ok 15406 1726854935.95538: done checking to see if all hosts have failed 15406 1726854935.95539: getting the remaining hosts for this loop 15406 1726854935.95540: done getting the remaining hosts for this loop 15406 1726854935.95542: getting the next task for host managed_node2 15406 1726854935.95546: done getting next task for host managed_node2 15406 1726854935.95547: ^ task is: TASK: meta (flush_handlers) 15406 1726854935.95549: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854935.95556: getting variables 15406 1726854935.95557: in VariableManager get_vars() 15406 1726854935.95565: Calling all_inventory to load vars for managed_node2 15406 1726854935.95567: Calling groups_inventory to load vars for managed_node2 15406 1726854935.95570: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854935.95574: Calling all_plugins_play to load vars for managed_node2 15406 1726854935.95576: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854935.95579: Calling groups_plugins_play to load vars for managed_node2 15406 1726854935.95959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854935.96406: done with get_vars() 15406 1726854935.96415: done getting variables 15406 1726854935.96595: in VariableManager get_vars() 15406 1726854935.96605: Calling all_inventory to load vars for managed_node2 15406 1726854935.96607: Calling groups_inventory to load vars for managed_node2 15406 1726854935.96609: Calling all_plugins_inventory to load vars for managed_node2 15406 
1726854935.96614: Calling all_plugins_play to load vars for managed_node2 15406 1726854935.96617: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854935.96619: Calling groups_plugins_play to load vars for managed_node2 15406 1726854935.96876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854935.97607: done with get_vars() 15406 1726854935.97677: done queuing things up, now waiting for results queue to drain 15406 1726854935.97679: results queue empty 15406 1726854935.97680: checking for any_errors_fatal 15406 1726854935.97681: done checking for any_errors_fatal 15406 1726854935.97681: checking for max_fail_percentage 15406 1726854935.97682: done checking for max_fail_percentage 15406 1726854935.97683: checking to see if all hosts have failed and the running result is not ok 15406 1726854935.97684: done checking to see if all hosts have failed 15406 1726854935.97684: getting the remaining hosts for this loop 15406 1726854935.97685: done getting the remaining hosts for this loop 15406 1726854935.97728: getting the next task for host managed_node2 15406 1726854935.97732: done getting next task for host managed_node2 15406 1726854935.97733: ^ task is: None 15406 1726854935.97735: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854935.97736: done queuing things up, now waiting for results queue to drain 15406 1726854935.97737: results queue empty 15406 1726854935.97737: checking for any_errors_fatal 15406 1726854935.97738: done checking for any_errors_fatal 15406 1726854935.97739: checking for max_fail_percentage 15406 1726854935.97739: done checking for max_fail_percentage 15406 1726854935.97740: checking to see if all hosts have failed and the running result is not ok 15406 1726854935.97741: done checking to see if all hosts have failed 15406 1726854935.97742: getting the next task for host managed_node2 15406 1726854935.97745: done getting next task for host managed_node2 15406 1726854935.97745: ^ task is: None 15406 1726854935.97746: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854935.97852: in VariableManager get_vars() 15406 1726854935.97869: done with get_vars() 15406 1726854935.97876: in VariableManager get_vars() 15406 1726854935.97944: done with get_vars() 15406 1726854935.97950: variable 'omit' from source: magic vars 15406 1726854935.97980: in VariableManager get_vars() 15406 1726854935.98057: done with get_vars() 15406 1726854935.98084: variable 'omit' from source: magic vars PLAY [Test configuring bridges] ************************************************ 15406 1726854935.98570: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15406 1726854935.98701: getting the remaining hosts for this loop 15406 1726854935.98703: done getting the remaining hosts for this loop 15406 1726854935.98709: getting the next task for host managed_node2 15406 1726854935.98712: done getting next task for host managed_node2 15406 1726854935.98714: ^ task is: TASK: Gathering Facts 15406 1726854935.98720: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854935.98722: getting variables 15406 1726854935.98723: in VariableManager get_vars() 15406 1726854935.98731: Calling all_inventory to load vars for managed_node2 15406 1726854935.98733: Calling groups_inventory to load vars for managed_node2 15406 1726854935.98739: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854935.98786: Calling all_plugins_play to load vars for managed_node2 15406 1726854935.98803: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854935.98834: Calling groups_plugins_play to load vars for managed_node2 15406 1726854935.99207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854935.99449: done with get_vars() 15406 1726854935.99455: done getting variables 15406 1726854935.99705: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3 Friday 20 September 2024 13:55:35 -0400 (0:00:00.134) 0:00:03.822 ****** 15406 1726854935.99985: entering _queue_task() for managed_node2/gather_facts 15406 1726854936.00475: worker is 1 (out of 1 available) 15406 1726854936.00485: exiting _queue_task() for managed_node2/gather_facts 15406 1726854936.00548: done queuing things up, now waiting for results queue to drain 15406 1726854936.00550: waiting for pending results... 
15406 1726854936.00811: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15406 1726854936.00980: in run() - task 0affcc66-ac2b-3c83-32d3-0000000000da 15406 1726854936.01166: variable 'ansible_search_path' from source: unknown 15406 1726854936.01175: calling self._execute() 15406 1726854936.01469: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854936.01473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854936.01475: variable 'omit' from source: magic vars 15406 1726854936.01971: variable 'ansible_distribution_major_version' from source: facts 15406 1726854936.01975: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854936.01977: variable 'omit' from source: magic vars 15406 1726854936.01978: variable 'omit' from source: magic vars 15406 1726854936.01981: variable 'omit' from source: magic vars 15406 1726854936.02023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854936.02062: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854936.02094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854936.02120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854936.02190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854936.02193: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854936.02195: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854936.02198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854936.02293: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 
1726854936.02307: Set connection var ansible_timeout to 10 15406 1726854936.02319: Set connection var ansible_connection to ssh 15406 1726854936.02330: Set connection var ansible_shell_type to sh 15406 1726854936.02340: Set connection var ansible_shell_executable to /bin/sh 15406 1726854936.02351: Set connection var ansible_pipelining to False 15406 1726854936.02375: variable 'ansible_shell_executable' from source: unknown 15406 1726854936.02382: variable 'ansible_connection' from source: unknown 15406 1726854936.02392: variable 'ansible_module_compression' from source: unknown 15406 1726854936.02427: variable 'ansible_shell_type' from source: unknown 15406 1726854936.02430: variable 'ansible_shell_executable' from source: unknown 15406 1726854936.02432: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854936.02435: variable 'ansible_pipelining' from source: unknown 15406 1726854936.02437: variable 'ansible_timeout' from source: unknown 15406 1726854936.02439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854936.02690: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854936.02696: variable 'omit' from source: magic vars 15406 1726854936.02699: starting attempt loop 15406 1726854936.02701: running the handler 15406 1726854936.02704: variable 'ansible_facts' from source: unknown 15406 1726854936.02728: _low_level_execute_command(): starting 15406 1726854936.02731: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854936.03579: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854936.03619: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 15406 1726854936.03718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854936.03745: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854936.03760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854936.03862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15406 1726854936.05969: stdout chunk (state=3): >>>/root <<< 15406 1726854936.06109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854936.06153: stderr chunk (state=3): >>><<< 15406 1726854936.06163: stdout chunk (state=3): >>><<< 15406 1726854936.06291: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15406 1726854936.06319: _low_level_execute_command(): starting 15406 1726854936.06323: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854936.0619686-15627-127736849919758 `" && echo ansible-tmp-1726854936.0619686-15627-127736849919758="` echo /root/.ansible/tmp/ansible-tmp-1726854936.0619686-15627-127736849919758 `" ) && sleep 0' 15406 1726854936.06905: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854936.06974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854936.07009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854936.07123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15406 1726854936.09802: stdout chunk (state=3): >>>ansible-tmp-1726854936.0619686-15627-127736849919758=/root/.ansible/tmp/ansible-tmp-1726854936.0619686-15627-127736849919758 <<< 15406 1726854936.10001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854936.10037: stdout chunk (state=3): >>><<< 15406 1726854936.10040: stderr chunk (state=3): >>><<< 15406 1726854936.10060: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854936.0619686-15627-127736849919758=/root/.ansible/tmp/ansible-tmp-1726854936.0619686-15627-127736849919758 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 
10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15406 1726854936.10193: variable 'ansible_module_compression' from source: unknown 15406 1726854936.10196: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15406 1726854936.10235: variable 'ansible_facts' from source: unknown 15406 1726854936.10470: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854936.0619686-15627-127736849919758/AnsiballZ_setup.py 15406 1726854936.10671: Sending initial data 15406 1726854936.10674: Sent initial data (154 bytes) 15406 1726854936.11393: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854936.11419: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854936.11481: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854936.11507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854936.11637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15406 1726854936.13938: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854936.14010: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854936.14084: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp_7yi_d79 /root/.ansible/tmp/ansible-tmp-1726854936.0619686-15627-127736849919758/AnsiballZ_setup.py <<< 15406 1726854936.14114: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854936.0619686-15627-127736849919758/AnsiballZ_setup.py" <<< 15406 1726854936.14339: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp_7yi_d79" to remote "/root/.ansible/tmp/ansible-tmp-1726854936.0619686-15627-127736849919758/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854936.0619686-15627-127736849919758/AnsiballZ_setup.py" <<< 15406 1726854936.17749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854936.17804: stderr chunk (state=3): >>><<< 15406 1726854936.17817: stdout chunk (state=3): >>><<< 15406 1726854936.17928: done transferring module to remote 15406 1726854936.17931: _low_level_execute_command(): starting 15406 1726854936.17933: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854936.0619686-15627-127736849919758/ /root/.ansible/tmp/ansible-tmp-1726854936.0619686-15627-127736849919758/AnsiballZ_setup.py && sleep 0' 15406 1726854936.19133: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854936.19211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854936.19216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 
1726854936.19219: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854936.19223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854936.19293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854936.19600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854936.19670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15406 1726854936.22294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854936.22302: stdout chunk (state=3): >>><<< 15406 1726854936.22305: stderr chunk (state=3): >>><<< 15406 1726854936.22308: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15406 1726854936.22310: _low_level_execute_command(): starting 15406 1726854936.22313: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854936.0619686-15627-127736849919758/AnsiballZ_setup.py && sleep 0' 15406 1726854936.23713: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854936.23730: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854936.23747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 
1726854936.23859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15406 1726854936.89217: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": 
["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_loadavg": {"1m": 0.544921875, "5m": 0.3798828125, "15m": 0.18359375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2941, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 590, "free": 2941}, "nocache": {"free": 3278, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", 
"ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 719, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797167104, "block_size": 4096, "block_total": 65519099, "block_available": 63915324, "block_used": 1603775, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 
41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "55", "second": "36", "epoch": "1726854936", "epoch_int": "1726854936", "date": "2024-09-20", "time": "13:55:36", "iso8601_micro": "2024-09-20T17:55:36.851943Z", "iso8601": "2024-09-20T17:55:36Z", "iso8601_basic": "20240920T135536851943", 
"iso8601_basic_short": "20240920T135536", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", 
"fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": <<< 15406 1726854936.89241: stdout chunk (state=3): >>>"off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off 
[fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], 
"ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15406 1726854936.91091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854936.91137: stderr chunk (state=3): >>><<< 15406 1726854936.91158: stdout chunk (state=3): >>><<< 15406 1726854936.91206: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": 
"Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_loadavg": {"1m": 0.544921875, "5m": 0.3798828125, "15m": 0.18359375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU 
E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2941, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 590, "free": 2941}, "nocache": {"free": 3278, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, 
"uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 719, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797167104, "block_size": 4096, "block_total": 65519099, "block_available": 63915324, "block_used": 1603775, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": 
"enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "55", "second": "36", "epoch": "1726854936", "epoch_int": "1726854936", "date": "2024-09-20", "time": "13:55:36", "iso8601_micro": "2024-09-20T17:55:36.851943Z", "iso8601": "2024-09-20T17:55:36Z", "iso8601_basic": "20240920T135536851943", "iso8601_basic_short": "20240920T135536", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": 
"on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854936.91604: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854936.0619686-15627-127736849919758/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854936.91608: _low_level_execute_command(): starting 15406 1726854936.91611: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854936.0619686-15627-127736849919758/ > /dev/null 2>&1 && sleep 0' 15406 1726854936.92273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854936.92282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854936.92305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854936.92400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854936.94286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854936.94291: stdout chunk (state=3): >>><<< 15406 1726854936.94294: stderr chunk (state=3): >>><<< 15406 1726854936.94306: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854936.94313: handler run 
complete 15406 1726854936.94396: variable 'ansible_facts' from source: unknown 15406 1726854936.94479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854936.94664: variable 'ansible_facts' from source: unknown 15406 1726854936.94721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854936.94798: attempt loop complete, returning result 15406 1726854936.94802: _execute() done 15406 1726854936.94804: dumping result to json 15406 1726854936.94826: done dumping result, returning 15406 1726854936.94833: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-3c83-32d3-0000000000da] 15406 1726854936.94838: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000da 15406 1726854936.95100: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000da 15406 1726854936.95102: WORKER PROCESS EXITING ok: [managed_node2] 15406 1726854936.95317: no more pending results, returning what we have 15406 1726854936.95325: results queue empty 15406 1726854936.95327: checking for any_errors_fatal 15406 1726854936.95327: done checking for any_errors_fatal 15406 1726854936.95328: checking for max_fail_percentage 15406 1726854936.95329: done checking for max_fail_percentage 15406 1726854936.95330: checking to see if all hosts have failed and the running result is not ok 15406 1726854936.95330: done checking to see if all hosts have failed 15406 1726854936.95331: getting the remaining hosts for this loop 15406 1726854936.95331: done getting the remaining hosts for this loop 15406 1726854936.95334: getting the next task for host managed_node2 15406 1726854936.95338: done getting next task for host managed_node2 15406 1726854936.95339: ^ task is: TASK: meta (flush_handlers) 15406 1726854936.95340: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854936.95343: getting variables 15406 1726854936.95344: in VariableManager get_vars() 15406 1726854936.95360: Calling all_inventory to load vars for managed_node2 15406 1726854936.95362: Calling groups_inventory to load vars for managed_node2 15406 1726854936.95364: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854936.95371: Calling all_plugins_play to load vars for managed_node2 15406 1726854936.95373: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854936.95375: Calling groups_plugins_play to load vars for managed_node2 15406 1726854936.95476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854936.95593: done with get_vars() 15406 1726854936.95601: done getting variables 15406 1726854936.95648: in VariableManager get_vars() 15406 1726854936.95657: Calling all_inventory to load vars for managed_node2 15406 1726854936.95659: Calling groups_inventory to load vars for managed_node2 15406 1726854936.95660: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854936.95663: Calling all_plugins_play to load vars for managed_node2 15406 1726854936.95665: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854936.95666: Calling groups_plugins_play to load vars for managed_node2 15406 1726854936.95757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854936.95864: done with get_vars() 15406 1726854936.95876: done queuing things up, now waiting for results queue to drain 15406 1726854936.95878: results queue empty 15406 1726854936.95878: checking for any_errors_fatal 15406 1726854936.95882: done checking for any_errors_fatal 15406 
1726854936.95883: checking for max_fail_percentage 15406 1726854936.95883: done checking for max_fail_percentage 15406 1726854936.95884: checking to see if all hosts have failed and the running result is not ok 15406 1726854936.95885: done checking to see if all hosts have failed 15406 1726854936.95890: getting the remaining hosts for this loop 15406 1726854936.95891: done getting the remaining hosts for this loop 15406 1726854936.95893: getting the next task for host managed_node2 15406 1726854936.95895: done getting next task for host managed_node2 15406 1726854936.95896: ^ task is: TASK: Set interface={{ interface }} 15406 1726854936.95897: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854936.95899: getting variables 15406 1726854936.95899: in VariableManager get_vars() 15406 1726854936.95905: Calling all_inventory to load vars for managed_node2 15406 1726854936.95906: Calling groups_inventory to load vars for managed_node2 15406 1726854936.95908: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854936.95911: Calling all_plugins_play to load vars for managed_node2 15406 1726854936.95912: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854936.95913: Calling groups_plugins_play to load vars for managed_node2 15406 1726854936.95995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854936.96101: done with get_vars() 15406 1726854936.96106: done getting variables 15406 1726854936.96131: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 15406 1726854936.96221: variable 'interface' from source: play vars TASK [Set interface=LSR-TST-br31] ********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:9 Friday 20 September 2024 13:55:36 -0400 (0:00:00.962) 0:00:04.785 ****** 15406 1726854936.96250: entering _queue_task() for managed_node2/set_fact 15406 1726854936.96456: worker is 1 (out of 1 available) 15406 1726854936.96471: exiting _queue_task() for managed_node2/set_fact 15406 1726854936.96489: done queuing things up, now waiting for results queue to drain 15406 1726854936.96490: waiting for pending results... 15406 1726854936.96647: running TaskExecutor() for managed_node2/TASK: Set interface=LSR-TST-br31 15406 1726854936.96711: in run() - task 0affcc66-ac2b-3c83-32d3-00000000000b 15406 1726854936.96721: variable 'ansible_search_path' from source: unknown 15406 1726854936.96751: calling self._execute() 15406 1726854936.96817: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854936.96820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854936.96833: variable 'omit' from source: magic vars 15406 1726854936.97102: variable 'ansible_distribution_major_version' from source: facts 15406 1726854936.97111: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854936.97117: variable 'omit' from source: magic vars 15406 1726854936.97137: variable 'omit' from source: magic vars 15406 1726854936.97160: variable 'interface' from source: play vars 15406 1726854936.97214: variable 'interface' from source: play vars 15406 1726854936.97227: variable 'omit' from source: magic vars 15406 1726854936.97257: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854936.97290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854936.97309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854936.97321: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854936.97330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854936.97353: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854936.97356: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854936.97358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854936.97434: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854936.97438: Set connection var ansible_timeout to 10 15406 1726854936.97441: Set connection var ansible_connection to ssh 15406 1726854936.97447: Set connection var ansible_shell_type to sh 15406 1726854936.97451: Set connection var ansible_shell_executable to /bin/sh 15406 1726854936.97457: Set connection var ansible_pipelining to False 15406 1726854936.97476: variable 'ansible_shell_executable' from source: unknown 15406 1726854936.97481: variable 'ansible_connection' from source: unknown 15406 1726854936.97484: variable 'ansible_module_compression' from source: unknown 15406 1726854936.97486: variable 'ansible_shell_type' from source: unknown 15406 1726854936.97489: variable 'ansible_shell_executable' from source: unknown 15406 1726854936.97492: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854936.97501: variable 'ansible_pipelining' from source: unknown 15406 1726854936.97504: variable 'ansible_timeout' from 
source: unknown 15406 1726854936.97506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854936.97604: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854936.97615: variable 'omit' from source: magic vars 15406 1726854936.97618: starting attempt loop 15406 1726854936.97621: running the handler 15406 1726854936.97628: handler run complete 15406 1726854936.97638: attempt loop complete, returning result 15406 1726854936.97641: _execute() done 15406 1726854936.97643: dumping result to json 15406 1726854936.97646: done dumping result, returning 15406 1726854936.97651: done running TaskExecutor() for managed_node2/TASK: Set interface=LSR-TST-br31 [0affcc66-ac2b-3c83-32d3-00000000000b] 15406 1726854936.97656: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000000b 15406 1726854936.97731: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000000b 15406 1726854936.97734: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "interface": "LSR-TST-br31" }, "changed": false } 15406 1726854936.97781: no more pending results, returning what we have 15406 1726854936.97784: results queue empty 15406 1726854936.97785: checking for any_errors_fatal 15406 1726854936.97796: done checking for any_errors_fatal 15406 1726854936.97797: checking for max_fail_percentage 15406 1726854936.97799: done checking for max_fail_percentage 15406 1726854936.97800: checking to see if all hosts have failed and the running result is not ok 15406 1726854936.97801: done checking to see if all hosts have failed 15406 1726854936.97801: getting the remaining hosts for this loop 15406 1726854936.97802: done getting the remaining hosts for this loop 15406 1726854936.97806: 
getting the next task for host managed_node2 15406 1726854936.97811: done getting next task for host managed_node2 15406 1726854936.97814: ^ task is: TASK: Include the task 'show_interfaces.yml' 15406 1726854936.97816: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854936.97818: getting variables 15406 1726854936.97820: in VariableManager get_vars() 15406 1726854936.97846: Calling all_inventory to load vars for managed_node2 15406 1726854936.97849: Calling groups_inventory to load vars for managed_node2 15406 1726854936.97852: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854936.97860: Calling all_plugins_play to load vars for managed_node2 15406 1726854936.97863: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854936.97865: Calling groups_plugins_play to load vars for managed_node2 15406 1726854936.98028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854936.98141: done with get_vars() 15406 1726854936.98147: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:12 Friday 20 September 2024 13:55:36 -0400 (0:00:00.019) 0:00:04.804 ****** 15406 1726854936.98217: entering _queue_task() for managed_node2/include_tasks 15406 1726854936.98670: worker is 1 (out of 1 available) 15406 1726854936.98691: exiting _queue_task() for managed_node2/include_tasks 15406 1726854936.98703: done queuing things up, now waiting for results queue to drain 15406 1726854936.98705: waiting for pending results... 
15406 1726854936.99103: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 15406 1726854936.99112: in run() - task 0affcc66-ac2b-3c83-32d3-00000000000c 15406 1726854936.99115: variable 'ansible_search_path' from source: unknown 15406 1726854936.99117: calling self._execute() 15406 1726854936.99196: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854936.99221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854936.99237: variable 'omit' from source: magic vars 15406 1726854936.99713: variable 'ansible_distribution_major_version' from source: facts 15406 1726854936.99736: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854936.99806: _execute() done 15406 1726854936.99809: dumping result to json 15406 1726854936.99811: done dumping result, returning 15406 1726854936.99813: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0affcc66-ac2b-3c83-32d3-00000000000c] 15406 1726854936.99815: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000000c 15406 1726854936.99918: no more pending results, returning what we have 15406 1726854936.99923: in VariableManager get_vars() 15406 1726854936.99955: Calling all_inventory to load vars for managed_node2 15406 1726854936.99957: Calling groups_inventory to load vars for managed_node2 15406 1726854936.99961: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854936.99973: Calling all_plugins_play to load vars for managed_node2 15406 1726854936.99976: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854936.99978: Calling groups_plugins_play to load vars for managed_node2 15406 1726854937.00434: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000000c 15406 1726854937.00437: WORKER PROCESS EXITING 15406 1726854937.00459: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854937.00659: done with get_vars() 15406 1726854937.00667: variable 'ansible_search_path' from source: unknown 15406 1726854937.00682: we have included files to process 15406 1726854937.00683: generating all_blocks data 15406 1726854937.00684: done generating all_blocks data 15406 1726854937.00685: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15406 1726854937.00685: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15406 1726854937.00689: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15406 1726854937.00820: in VariableManager get_vars() 15406 1726854937.00839: done with get_vars() 15406 1726854937.00943: done processing included file 15406 1726854937.00945: iterating over new_blocks loaded from include file 15406 1726854937.00946: in VariableManager get_vars() 15406 1726854937.00958: done with get_vars() 15406 1726854937.00959: filtering new block on tags 15406 1726854937.00980: done filtering new block on tags 15406 1726854937.00983: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 15406 1726854937.00989: extending task lists for all hosts with included blocks 15406 1726854937.01032: done extending task lists 15406 1726854937.01033: done processing included files 15406 1726854937.01034: results queue empty 15406 1726854937.01034: checking for any_errors_fatal 15406 1726854937.01037: done checking for any_errors_fatal 15406 1726854937.01038: checking for max_fail_percentage 15406 1726854937.01039: done checking for 
max_fail_percentage 15406 1726854937.01039: checking to see if all hosts have failed and the running result is not ok 15406 1726854937.01040: done checking to see if all hosts have failed 15406 1726854937.01040: getting the remaining hosts for this loop 15406 1726854937.01041: done getting the remaining hosts for this loop 15406 1726854937.01042: getting the next task for host managed_node2 15406 1726854937.01045: done getting next task for host managed_node2 15406 1726854937.01047: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 15406 1726854937.01048: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854937.01050: getting variables 15406 1726854937.01050: in VariableManager get_vars() 15406 1726854937.01057: Calling all_inventory to load vars for managed_node2 15406 1726854937.01059: Calling groups_inventory to load vars for managed_node2 15406 1726854937.01060: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854937.01064: Calling all_plugins_play to load vars for managed_node2 15406 1726854937.01066: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854937.01070: Calling groups_plugins_play to load vars for managed_node2 15406 1726854937.01198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854937.01304: done with get_vars() 15406 1726854937.01311: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:55:37 -0400 (0:00:00.031) 0:00:04.836 ****** 15406 1726854937.01390: entering _queue_task() for managed_node2/include_tasks 15406 1726854937.01597: worker is 1 (out of 1 available) 15406 1726854937.01610: exiting _queue_task() for managed_node2/include_tasks 15406 1726854937.01623: done queuing things up, now waiting for results queue to drain 15406 1726854937.01624: waiting for pending results... 
15406 1726854937.01829: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 15406 1726854937.01897: in run() - task 0affcc66-ac2b-3c83-32d3-0000000000ee 15406 1726854937.01908: variable 'ansible_search_path' from source: unknown 15406 1726854937.01911: variable 'ansible_search_path' from source: unknown 15406 1726854937.01937: calling self._execute() 15406 1726854937.02016: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854937.02020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854937.02028: variable 'omit' from source: magic vars 15406 1726854937.02308: variable 'ansible_distribution_major_version' from source: facts 15406 1726854937.02320: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854937.02323: _execute() done 15406 1726854937.02327: dumping result to json 15406 1726854937.02329: done dumping result, returning 15406 1726854937.02334: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0affcc66-ac2b-3c83-32d3-0000000000ee] 15406 1726854937.02339: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000ee 15406 1726854937.02417: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000ee 15406 1726854937.02421: WORKER PROCESS EXITING 15406 1726854937.02446: no more pending results, returning what we have 15406 1726854937.02450: in VariableManager get_vars() 15406 1726854937.02480: Calling all_inventory to load vars for managed_node2 15406 1726854937.02483: Calling groups_inventory to load vars for managed_node2 15406 1726854937.02486: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854937.02499: Calling all_plugins_play to load vars for managed_node2 15406 1726854937.02501: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854937.02504: Calling groups_plugins_play to load vars for managed_node2 15406 
1726854937.02639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854937.02750: done with get_vars() 15406 1726854937.02755: variable 'ansible_search_path' from source: unknown 15406 1726854937.02755: variable 'ansible_search_path' from source: unknown 15406 1726854937.02779: we have included files to process 15406 1726854937.02779: generating all_blocks data 15406 1726854937.02783: done generating all_blocks data 15406 1726854937.02784: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15406 1726854937.02784: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15406 1726854937.02786: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15406 1726854937.03010: done processing included file 15406 1726854937.03011: iterating over new_blocks loaded from include file 15406 1726854937.03012: in VariableManager get_vars() 15406 1726854937.03022: done with get_vars() 15406 1726854937.03023: filtering new block on tags 15406 1726854937.03033: done filtering new block on tags 15406 1726854937.03035: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 15406 1726854937.03038: extending task lists for all hosts with included blocks 15406 1726854937.03096: done extending task lists 15406 1726854937.03097: done processing included files 15406 1726854937.03098: results queue empty 15406 1726854937.03098: checking for any_errors_fatal 15406 1726854937.03100: done checking for any_errors_fatal 15406 1726854937.03100: checking for max_fail_percentage 15406 1726854937.03101: done 
checking for max_fail_percentage 15406 1726854937.03102: checking to see if all hosts have failed and the running result is not ok 15406 1726854937.03102: done checking to see if all hosts have failed 15406 1726854937.03103: getting the remaining hosts for this loop 15406 1726854937.03103: done getting the remaining hosts for this loop 15406 1726854937.03105: getting the next task for host managed_node2 15406 1726854937.03108: done getting next task for host managed_node2 15406 1726854937.03109: ^ task is: TASK: Gather current interface info 15406 1726854937.03111: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854937.03112: getting variables 15406 1726854937.03113: in VariableManager get_vars() 15406 1726854937.03118: Calling all_inventory to load vars for managed_node2 15406 1726854937.03119: Calling groups_inventory to load vars for managed_node2 15406 1726854937.03121: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854937.03125: Calling all_plugins_play to load vars for managed_node2 15406 1726854937.03127: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854937.03130: Calling groups_plugins_play to load vars for managed_node2 15406 1726854937.03221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854937.03331: done with get_vars() 15406 1726854937.03337: done getting variables 15406 1726854937.03364: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:55:37 -0400 (0:00:00.019) 0:00:04.856 ****** 15406 1726854937.03385: entering _queue_task() for managed_node2/command 15406 1726854937.03569: worker is 1 (out of 1 available) 15406 1726854937.03584: exiting _queue_task() for managed_node2/command 15406 1726854937.03597: done queuing things up, now waiting for results queue to drain 15406 1726854937.03599: waiting for pending results... 
15406 1726854937.03739: running TaskExecutor() for managed_node2/TASK: Gather current interface info 15406 1726854937.03808: in run() - task 0affcc66-ac2b-3c83-32d3-0000000000fd 15406 1726854937.03817: variable 'ansible_search_path' from source: unknown 15406 1726854937.03824: variable 'ansible_search_path' from source: unknown 15406 1726854937.03850: calling self._execute() 15406 1726854937.03931: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854937.03937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854937.04002: variable 'omit' from source: magic vars 15406 1726854937.04266: variable 'ansible_distribution_major_version' from source: facts 15406 1726854937.04295: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854937.04345: variable 'omit' from source: magic vars 15406 1726854937.04348: variable 'omit' from source: magic vars 15406 1726854937.04372: variable 'omit' from source: magic vars 15406 1726854937.04405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854937.04441: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854937.04450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854937.04463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854937.04473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854937.04498: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854937.04502: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854937.04504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 
1726854937.04659: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854937.04663: Set connection var ansible_timeout to 10 15406 1726854937.04666: Set connection var ansible_connection to ssh 15406 1726854937.04668: Set connection var ansible_shell_type to sh 15406 1726854937.04671: Set connection var ansible_shell_executable to /bin/sh 15406 1726854937.04674: Set connection var ansible_pipelining to False 15406 1726854937.04676: variable 'ansible_shell_executable' from source: unknown 15406 1726854937.04678: variable 'ansible_connection' from source: unknown 15406 1726854937.04683: variable 'ansible_module_compression' from source: unknown 15406 1726854937.04685: variable 'ansible_shell_type' from source: unknown 15406 1726854937.04689: variable 'ansible_shell_executable' from source: unknown 15406 1726854937.04691: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854937.04693: variable 'ansible_pipelining' from source: unknown 15406 1726854937.04695: variable 'ansible_timeout' from source: unknown 15406 1726854937.04697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854937.04742: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854937.04750: variable 'omit' from source: magic vars 15406 1726854937.04754: starting attempt loop 15406 1726854937.04762: running the handler 15406 1726854937.04806: _low_level_execute_command(): starting 15406 1726854937.04809: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854937.05336: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854937.05339: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854937.05342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854937.05345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854937.05347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854937.05397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854937.05401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854937.05484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854937.07111: stdout chunk (state=3): >>>/root <<< 15406 1726854937.07219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854937.07247: stderr chunk (state=3): >>><<< 15406 1726854937.07250: stdout chunk (state=3): >>><<< 15406 1726854937.07275: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854937.07291: _low_level_execute_command(): starting 15406 1726854937.07319: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854937.0726979-15689-131969987032778 `" && echo ansible-tmp-1726854937.0726979-15689-131969987032778="` echo /root/.ansible/tmp/ansible-tmp-1726854937.0726979-15689-131969987032778 `" ) && sleep 0' 15406 1726854937.07898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854937.07917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854937.07937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854937.07956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854937.07974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854937.08212: stderr chunk (state=3): >>>debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854937.08221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854937.08244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854937.08281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854937.08418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854937.10286: stdout chunk (state=3): >>>ansible-tmp-1726854937.0726979-15689-131969987032778=/root/.ansible/tmp/ansible-tmp-1726854937.0726979-15689-131969987032778 <<< 15406 1726854937.10392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854937.10440: stderr chunk (state=3): >>><<< 15406 1726854937.10443: stdout chunk (state=3): >>><<< 15406 1726854937.10471: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854937.0726979-15689-131969987032778=/root/.ansible/tmp/ansible-tmp-1726854937.0726979-15689-131969987032778 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854937.10490: variable 'ansible_module_compression' from source: unknown 15406 1726854937.10536: ANSIBALLZ: Using generic lock for ansible.legacy.command 15406 1726854937.10539: ANSIBALLZ: Acquiring lock 15406 1726854937.10542: ANSIBALLZ: Lock acquired: 140626835985552 15406 1726854937.10544: ANSIBALLZ: Creating module 15406 1726854937.33645: ANSIBALLZ: Writing module into payload 15406 1726854937.33797: ANSIBALLZ: Writing module 15406 1726854937.33802: ANSIBALLZ: Renaming module 15406 1726854937.33813: ANSIBALLZ: Done creating module 15406 1726854937.33834: variable 'ansible_facts' from source: unknown 15406 1726854937.34013: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854937.0726979-15689-131969987032778/AnsiballZ_command.py 15406 1726854937.34144: Sending initial data 15406 1726854937.34147: Sent initial data (156 bytes) 15406 1726854937.34723: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854937.34739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854937.34817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854937.34834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854937.35039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854937.35108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854937.35226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854937.35365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854937.37024: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server 
supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854937.37223: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15406 1726854937.37316: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp16jpq4qb /root/.ansible/tmp/ansible-tmp-1726854937.0726979-15689-131969987032778/AnsiballZ_command.py <<< 15406 1726854937.37319: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854937.0726979-15689-131969987032778/AnsiballZ_command.py" <<< 15406 1726854937.37367: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp16jpq4qb" to remote "/root/.ansible/tmp/ansible-tmp-1726854937.0726979-15689-131969987032778/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854937.0726979-15689-131969987032778/AnsiballZ_command.py" <<< 15406 1726854937.38876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854937.38883: stdout chunk (state=3): >>><<< 15406 1726854937.38885: stderr chunk (state=3): >>><<< 15406 1726854937.39070: done transferring module to remote 15406 1726854937.39079: _low_level_execute_command(): starting 15406 1726854937.39104: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854937.0726979-15689-131969987032778/ /root/.ansible/tmp/ansible-tmp-1726854937.0726979-15689-131969987032778/AnsiballZ_command.py && sleep 0' 15406 1726854937.40801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854937.41013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854937.41205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854937.42948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854937.42952: stderr chunk (state=3): >>><<< 15406 1726854937.43104: stdout chunk (state=3): >>><<< 15406 1726854937.43121: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854937.43125: _low_level_execute_command(): starting 15406 1726854937.43129: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854937.0726979-15689-131969987032778/AnsiballZ_command.py && sleep 0' 15406 1726854937.44629: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854937.44633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854937.44636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854937.44638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854937.60017: stdout chunk 
(state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:55:37.594840", "end": "2024-09-20 13:55:37.598148", "delta": "0:00:00.003308", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15406 1726854937.61511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854937.61560: stderr chunk (state=3): >>><<< 15406 1726854937.61579: stdout chunk (state=3): >>><<< 15406 1726854937.61602: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:55:37.594840", "end": "2024-09-20 13:55:37.598148", "delta": "0:00:00.003308", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854937.61644: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854937.0726979-15689-131969987032778/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854937.61652: _low_level_execute_command(): starting 15406 1726854937.61658: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854937.0726979-15689-131969987032778/ > /dev/null 2>&1 && sleep 0' 15406 1726854937.62312: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854937.62322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854937.62345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854937.62360: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854937.62374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854937.62393: stderr chunk (state=3): >>>debug2: match not found <<< 15406 1726854937.62396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854937.62493: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15406 1726854937.62497: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 15406 1726854937.62499: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15406 1726854937.62501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854937.62518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854937.62542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854937.62570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854937.62675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854937.64591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854937.64595: stdout chunk (state=3): >>><<< 15406 1726854937.64793: stderr chunk (state=3): >>><<< 15406 1726854937.64799: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854937.64802: handler run complete 15406 1726854937.64804: Evaluated conditional (False): False 15406 1726854937.64806: attempt loop complete, returning result 15406 1726854937.64808: _execute() done 15406 1726854937.64809: dumping result to json 15406 1726854937.64811: done dumping result, returning 15406 1726854937.64813: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0affcc66-ac2b-3c83-32d3-0000000000fd] 15406 1726854937.64815: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000fd 15406 1726854937.64890: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000fd 15406 1726854937.64894: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003308", "end": "2024-09-20 13:55:37.598148", "rc": 0, "start": "2024-09-20 13:55:37.594840" } STDOUT: bonding_masters eth0 lo 15406 1726854937.64968: no more pending results, returning what we have 
15406 1726854937.64972: results queue empty 15406 1726854937.64973: checking for any_errors_fatal 15406 1726854937.64975: done checking for any_errors_fatal 15406 1726854937.64975: checking for max_fail_percentage 15406 1726854937.64977: done checking for max_fail_percentage 15406 1726854937.64978: checking to see if all hosts have failed and the running result is not ok 15406 1726854937.64979: done checking to see if all hosts have failed 15406 1726854937.64979: getting the remaining hosts for this loop 15406 1726854937.64981: done getting the remaining hosts for this loop 15406 1726854937.64984: getting the next task for host managed_node2 15406 1726854937.65109: done getting next task for host managed_node2 15406 1726854937.65112: ^ task is: TASK: Set current_interfaces 15406 1726854937.65116: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854937.65119: getting variables 15406 1726854937.65121: in VariableManager get_vars() 15406 1726854937.65149: Calling all_inventory to load vars for managed_node2 15406 1726854937.65152: Calling groups_inventory to load vars for managed_node2 15406 1726854937.65155: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854937.65166: Calling all_plugins_play to load vars for managed_node2 15406 1726854937.65169: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854937.65172: Calling groups_plugins_play to load vars for managed_node2 15406 1726854937.65664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854937.65877: done with get_vars() 15406 1726854937.65889: done getting variables 15406 1726854937.65958: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:55:37 -0400 (0:00:00.626) 0:00:05.482 ****** 15406 1726854937.65989: entering _queue_task() for managed_node2/set_fact 15406 1726854937.66513: worker is 1 (out of 1 available) 15406 1726854937.66522: exiting _queue_task() for managed_node2/set_fact 15406 1726854937.66532: done queuing things up, now waiting for results queue to drain 15406 1726854937.66533: waiting for pending results... 
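The "Gather current interface info" result above (rc=0, stdout `bonding_masters\neth0\nlo`) was produced by a `command` task in `get_current_interfaces.yml`. The task file's source is not included in this log; a plausible reconstruction, inferred only from the `module_args` the log records (`chdir: /sys/class/net`, `_raw_params: ls -1`) and from the `_current_interfaces` variable seen later, would be:

```yaml
# Hypothetical reconstruction -- the actual task file is not shown in the log.
# module_args recorded above: chdir=/sys/class/net, _raw_params="ls -1".
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: _current_interfaces  # register name inferred from the later set_fact trace
```

Listing `/sys/class/net` is a common way to enumerate kernel network interfaces, which is consistent with the `bonding_masters`, `eth0`, and `lo` entries in the captured stdout.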
15406 1726854937.66595: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 15406 1726854937.66765: in run() - task 0affcc66-ac2b-3c83-32d3-0000000000fe 15406 1726854937.66768: variable 'ansible_search_path' from source: unknown 15406 1726854937.66771: variable 'ansible_search_path' from source: unknown 15406 1726854937.66774: calling self._execute() 15406 1726854937.66848: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854937.66869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854937.66889: variable 'omit' from source: magic vars 15406 1726854937.67285: variable 'ansible_distribution_major_version' from source: facts 15406 1726854937.67319: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854937.67331: variable 'omit' from source: magic vars 15406 1726854937.67380: variable 'omit' from source: magic vars 15406 1726854937.67503: variable '_current_interfaces' from source: set_fact 15406 1726854937.67585: variable 'omit' from source: magic vars 15406 1726854937.67661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854937.67703: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854937.67745: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854937.67854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854937.67857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854937.67860: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854937.67861: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854937.67864: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854937.67934: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854937.67945: Set connection var ansible_timeout to 10 15406 1726854937.67965: Set connection var ansible_connection to ssh 15406 1726854937.67976: Set connection var ansible_shell_type to sh 15406 1726854937.67989: Set connection var ansible_shell_executable to /bin/sh 15406 1726854937.68003: Set connection var ansible_pipelining to False 15406 1726854937.68030: variable 'ansible_shell_executable' from source: unknown 15406 1726854937.68037: variable 'ansible_connection' from source: unknown 15406 1726854937.68044: variable 'ansible_module_compression' from source: unknown 15406 1726854937.68051: variable 'ansible_shell_type' from source: unknown 15406 1726854937.68057: variable 'ansible_shell_executable' from source: unknown 15406 1726854937.68080: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854937.68091: variable 'ansible_pipelining' from source: unknown 15406 1726854937.68098: variable 'ansible_timeout' from source: unknown 15406 1726854937.68105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854937.68252: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854937.68295: variable 'omit' from source: magic vars 15406 1726854937.68298: starting attempt loop 15406 1726854937.68300: running the handler 15406 1726854937.68401: handler run complete 15406 1726854937.68404: attempt loop complete, returning result 15406 1726854937.68406: _execute() done 15406 1726854937.68409: dumping result to json 15406 1726854937.68411: done dumping result, returning 15406 
1726854937.68413: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0affcc66-ac2b-3c83-32d3-0000000000fe] 15406 1726854937.68415: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000fe 15406 1726854937.68478: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000fe 15406 1726854937.68482: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 15406 1726854937.68554: no more pending results, returning what we have 15406 1726854937.68558: results queue empty 15406 1726854937.68560: checking for any_errors_fatal 15406 1726854937.68571: done checking for any_errors_fatal 15406 1726854937.68572: checking for max_fail_percentage 15406 1726854937.68574: done checking for max_fail_percentage 15406 1726854937.68575: checking to see if all hosts have failed and the running result is not ok 15406 1726854937.68576: done checking to see if all hosts have failed 15406 1726854937.68577: getting the remaining hosts for this loop 15406 1726854937.68578: done getting the remaining hosts for this loop 15406 1726854937.68582: getting the next task for host managed_node2 15406 1726854937.68593: done getting next task for host managed_node2 15406 1726854937.68596: ^ task is: TASK: Show current_interfaces 15406 1726854937.68599: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854937.68603: getting variables 15406 1726854937.68604: in VariableManager get_vars() 15406 1726854937.68636: Calling all_inventory to load vars for managed_node2 15406 1726854937.68639: Calling groups_inventory to load vars for managed_node2 15406 1726854937.68643: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854937.68654: Calling all_plugins_play to load vars for managed_node2 15406 1726854937.68657: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854937.68660: Calling groups_plugins_play to load vars for managed_node2 15406 1726854937.69238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854937.69646: done with get_vars() 15406 1726854937.69667: done getting variables 15406 1726854937.69763: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:55:37 -0400 (0:00:00.038) 0:00:05.520 ****** 15406 1726854937.69798: entering _queue_task() for managed_node2/debug 15406 1726854937.69799: Creating lock for debug 15406 1726854937.70064: worker is 1 (out of 1 available) 15406 1726854937.70078: exiting _queue_task() for managed_node2/debug 15406 1726854937.70107: done queuing things up, now waiting for results queue to drain 15406 1726854937.70109: waiting for pending results... 
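The "Set current_interfaces" task just completed by setting the fact `current_interfaces: ["bonding_masters", "eth0", "lo"]` from the `_current_interfaces` variable. The log shows only the resulting fact, not the task source; a minimal sketch consistent with it would be:

```yaml
# Hypothetical sketch; only the resulting fact appears in the log.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```

`stdout_lines` is the conventional way to turn the registered command output (`bonding_masters\neth0\nlo`) into the list seen in the `ansible_facts` result above.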
15406 1726854937.70409: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 15406 1726854937.70466: in run() - task 0affcc66-ac2b-3c83-32d3-0000000000ef 15406 1726854937.70478: variable 'ansible_search_path' from source: unknown 15406 1726854937.70485: variable 'ansible_search_path' from source: unknown 15406 1726854937.70513: calling self._execute() 15406 1726854937.70574: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854937.70577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854937.70591: variable 'omit' from source: magic vars 15406 1726854937.70896: variable 'ansible_distribution_major_version' from source: facts 15406 1726854937.70906: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854937.70912: variable 'omit' from source: magic vars 15406 1726854937.70935: variable 'omit' from source: magic vars 15406 1726854937.71004: variable 'current_interfaces' from source: set_fact 15406 1726854937.71024: variable 'omit' from source: magic vars 15406 1726854937.71053: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854937.71079: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854937.71097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854937.71112: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854937.71121: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854937.71143: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854937.71146: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854937.71149: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854937.71221: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854937.71224: Set connection var ansible_timeout to 10 15406 1726854937.71227: Set connection var ansible_connection to ssh 15406 1726854937.71229: Set connection var ansible_shell_type to sh 15406 1726854937.71232: Set connection var ansible_shell_executable to /bin/sh 15406 1726854937.71239: Set connection var ansible_pipelining to False 15406 1726854937.71257: variable 'ansible_shell_executable' from source: unknown 15406 1726854937.71260: variable 'ansible_connection' from source: unknown 15406 1726854937.71262: variable 'ansible_module_compression' from source: unknown 15406 1726854937.71265: variable 'ansible_shell_type' from source: unknown 15406 1726854937.71267: variable 'ansible_shell_executable' from source: unknown 15406 1726854937.71269: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854937.71271: variable 'ansible_pipelining' from source: unknown 15406 1726854937.71273: variable 'ansible_timeout' from source: unknown 15406 1726854937.71278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854937.71374: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854937.71385: variable 'omit' from source: magic vars 15406 1726854937.71391: starting attempt loop 15406 1726854937.71394: running the handler 15406 1726854937.71428: handler run complete 15406 1726854937.71443: attempt loop complete, returning result 15406 1726854937.71446: _execute() done 15406 1726854937.71448: dumping result to json 15406 1726854937.71451: done dumping result, returning 15406 1726854937.71454: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0affcc66-ac2b-3c83-32d3-0000000000ef] 15406 1726854937.71456: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000ef 15406 1726854937.71535: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000000ef 15406 1726854937.71540: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 15406 1726854937.71586: no more pending results, returning what we have 15406 1726854937.71591: results queue empty 15406 1726854937.71592: checking for any_errors_fatal 15406 1726854937.71596: done checking for any_errors_fatal 15406 1726854937.71596: checking for max_fail_percentage 15406 1726854937.71598: done checking for max_fail_percentage 15406 1726854937.71599: checking to see if all hosts have failed and the running result is not ok 15406 1726854937.71600: done checking to see if all hosts have failed 15406 1726854937.71600: getting the remaining hosts for this loop 15406 1726854937.71601: done getting the remaining hosts for this loop 15406 1726854937.71605: getting the next task for host managed_node2 15406 1726854937.71611: done getting next task for host managed_node2 15406 1726854937.71614: ^ task is: TASK: Include the task 'assert_device_absent.yml' 15406 1726854937.71616: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854937.71618: getting variables 15406 1726854937.71620: in VariableManager get_vars() 15406 1726854937.71643: Calling all_inventory to load vars for managed_node2 15406 1726854937.71646: Calling groups_inventory to load vars for managed_node2 15406 1726854937.71648: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854937.71658: Calling all_plugins_play to load vars for managed_node2 15406 1726854937.71660: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854937.71663: Calling groups_plugins_play to load vars for managed_node2 15406 1726854937.71833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854937.71944: done with get_vars() 15406 1726854937.71951: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:14 Friday 20 September 2024 13:55:37 -0400 (0:00:00.022) 0:00:05.542 ****** 15406 1726854937.72010: entering _queue_task() for managed_node2/include_tasks 15406 1726854937.72245: worker is 1 (out of 1 available) 15406 1726854937.72259: exiting _queue_task() for managed_node2/include_tasks 15406 1726854937.72272: done queuing things up, now waiting for results queue to drain 15406 1726854937.72274: waiting for pending results... 
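The "Show current_interfaces" task above emitted `MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']` via the `debug` action module. A sketch matching that output (the actual task source in `show_interfaces.yml` is not reproduced in the log) would be:

```yaml
# Hypothetical sketch matching the MSG line in the task result above.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```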
15406 1726854937.72538: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' 15406 1726854937.72899: in run() - task 0affcc66-ac2b-3c83-32d3-00000000000d 15406 1726854937.72903: variable 'ansible_search_path' from source: unknown 15406 1726854937.72905: calling self._execute() 15406 1726854937.72953: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854937.73015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854937.73028: variable 'omit' from source: magic vars 15406 1726854937.73656: variable 'ansible_distribution_major_version' from source: facts 15406 1726854937.73666: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854937.73674: _execute() done 15406 1726854937.73676: dumping result to json 15406 1726854937.73679: done dumping result, returning 15406 1726854937.73688: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' [0affcc66-ac2b-3c83-32d3-00000000000d] 15406 1726854937.73710: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000000d 15406 1726854937.73829: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000000d 15406 1726854937.73895: no more pending results, returning what we have 15406 1726854937.73901: in VariableManager get_vars() 15406 1726854937.73934: Calling all_inventory to load vars for managed_node2 15406 1726854937.73937: Calling groups_inventory to load vars for managed_node2 15406 1726854937.73941: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854937.73954: Calling all_plugins_play to load vars for managed_node2 15406 1726854937.73957: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854937.73961: Calling groups_plugins_play to load vars for managed_node2 15406 1726854937.74454: WORKER PROCESS EXITING 15406 1726854937.74478: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854937.74664: done with get_vars() 15406 1726854937.74672: variable 'ansible_search_path' from source: unknown 15406 1726854937.74685: we have included files to process 15406 1726854937.74686: generating all_blocks data 15406 1726854937.74689: done generating all_blocks data 15406 1726854937.74694: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15406 1726854937.74695: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15406 1726854937.74698: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15406 1726854937.74842: in VariableManager get_vars() 15406 1726854937.74856: done with get_vars() 15406 1726854937.75013: done processing included file 15406 1726854937.75015: iterating over new_blocks loaded from include file 15406 1726854937.75016: in VariableManager get_vars() 15406 1726854937.75027: done with get_vars() 15406 1726854937.75028: filtering new block on tags 15406 1726854937.75043: done filtering new block on tags 15406 1726854937.75045: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 15406 1726854937.75050: extending task lists for all hosts with included blocks 15406 1726854937.75196: done extending task lists 15406 1726854937.75198: done processing included files 15406 1726854937.75199: results queue empty 15406 1726854937.75199: checking for any_errors_fatal 15406 1726854937.75203: done checking for any_errors_fatal 15406 1726854937.75204: checking for max_fail_percentage 15406 1726854937.75205: done 
checking for max_fail_percentage 15406 1726854937.75206: checking to see if all hosts have failed and the running result is not ok 15406 1726854937.75207: done checking to see if all hosts have failed 15406 1726854937.75207: getting the remaining hosts for this loop 15406 1726854937.75208: done getting the remaining hosts for this loop 15406 1726854937.75211: getting the next task for host managed_node2 15406 1726854937.75215: done getting next task for host managed_node2 15406 1726854937.75217: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15406 1726854937.75219: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854937.75221: getting variables 15406 1726854937.75222: in VariableManager get_vars() 15406 1726854937.75230: Calling all_inventory to load vars for managed_node2 15406 1726854937.75232: Calling groups_inventory to load vars for managed_node2 15406 1726854937.75234: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854937.75239: Calling all_plugins_play to load vars for managed_node2 15406 1726854937.75241: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854937.75243: Calling groups_plugins_play to load vars for managed_node2 15406 1726854937.75374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854937.75553: done with get_vars() 15406 1726854937.75562: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 13:55:37 -0400 (0:00:00.036) 0:00:05.579 ****** 15406 1726854937.75629: entering _queue_task() for managed_node2/include_tasks 15406 1726854937.76046: worker is 1 (out of 1 available) 15406 1726854937.76055: exiting _queue_task() for managed_node2/include_tasks 15406 1726854937.76065: done queuing things up, now waiting for results queue to drain 15406 1726854937.76067: waiting for pending results... 
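The include task above dynamically loaded `assert_device_absent.yml` and extended the task list for `managed_node2`, as the "extending task lists for all hosts with included blocks" messages show. The log records only the include and the resolved file path; the task itself presumably looks like:

```yaml
# Hypothetical sketch; the playbook source (tests_bridge.yml:14) is not shown in the log.
- name: Include the task 'assert_device_absent.yml'
  include_tasks: tasks/assert_device_absent.yml
```

Because `include_tasks` is dynamic, its conditional (`ansible_distribution_major_version != '6'`) is evaluated at run time before the file's blocks are inserted, which matches the "generating all_blocks data" sequence in the trace.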
15406 1726854937.76208: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml'
15406 1726854937.76352: in run() - task 0affcc66-ac2b-3c83-32d3-000000000119
15406 1726854937.76367: variable 'ansible_search_path' from source: unknown
15406 1726854937.76375: variable 'ansible_search_path' from source: unknown
15406 1726854937.76415: calling self._execute()
15406 1726854937.76486: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854937.76502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854937.76515: variable 'omit' from source: magic vars
15406 1726854937.76861: variable 'ansible_distribution_major_version' from source: facts
15406 1726854937.76876: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854937.76885: _execute() done
15406 1726854937.76894: dumping result to json
15406 1726854937.76902: done dumping result, returning
15406 1726854937.76910: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-3c83-32d3-000000000119]
15406 1726854937.76918: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000119
15406 1726854937.77011: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000119
15406 1726854937.77018: WORKER PROCESS EXITING
15406 1726854937.77063: no more pending results, returning what we have
15406 1726854937.77068: in VariableManager get_vars()
15406 1726854937.77103: Calling all_inventory to load vars for managed_node2
15406 1726854937.77106: Calling groups_inventory to load vars for managed_node2
15406 1726854937.77110: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854937.77122: Calling all_plugins_play to load vars for managed_node2
15406 1726854937.77125: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854937.77128: Calling groups_plugins_play to load vars for managed_node2
15406 1726854937.77503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854937.77689: done with get_vars()
15406 1726854937.77697: variable 'ansible_search_path' from source: unknown
15406 1726854937.77698: variable 'ansible_search_path' from source: unknown
15406 1726854937.77730: we have included files to process
15406 1726854937.77731: generating all_blocks data
15406 1726854937.77733: done generating all_blocks data
15406 1726854937.77734: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
15406 1726854937.77735: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
15406 1726854937.77737: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
15406 1726854937.77944: done processing included file
15406 1726854937.77946: iterating over new_blocks loaded from include file
15406 1726854937.77948: in VariableManager get_vars()
15406 1726854937.77959: done with get_vars()
15406 1726854937.77961: filtering new block on tags
15406 1726854937.77974: done filtering new block on tags
15406 1726854937.77976: done iterating over new_blocks loaded from include file
included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2
15406 1726854937.77980: extending task lists for all hosts with included blocks
15406 1726854937.78074: done extending task lists
15406 1726854937.78075: done processing included files
15406 1726854937.78076: results queue empty
15406 1726854937.78077: checking for any_errors_fatal
15406 1726854937.78079: done checking for any_errors_fatal
15406 1726854937.78080: checking for max_fail_percentage
15406 1726854937.78081: done checking for max_fail_percentage
15406 1726854937.78082: checking to see if all hosts have failed and the running result is not ok
15406 1726854937.78083: done checking to see if all hosts have failed
15406 1726854937.78083: getting the remaining hosts for this loop
15406 1726854937.78084: done getting the remaining hosts for this loop
15406 1726854937.78088: getting the next task for host managed_node2
15406 1726854937.78093: done getting next task for host managed_node2
15406 1726854937.78095: ^ task is: TASK: Get stat for interface {{ interface }}
15406 1726854937.78098: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854937.78100: getting variables
15406 1726854937.78101: in VariableManager get_vars()
15406 1726854937.78108: Calling all_inventory to load vars for managed_node2
15406 1726854937.78110: Calling groups_inventory to load vars for managed_node2
15406 1726854937.78113: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854937.78117: Calling all_plugins_play to load vars for managed_node2
15406 1726854937.78119: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854937.78122: Calling groups_plugins_play to load vars for managed_node2
15406 1726854937.78258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854937.78433: done with get_vars()
15406 1726854937.78441: done getting variables
15406 1726854937.78615: variable 'interface' from source: set_fact

TASK [Get stat for interface LSR-TST-br31] *************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Friday 20 September 2024 13:55:37 -0400 (0:00:00.030) 0:00:05.609 ******

15406 1726854937.78641: entering _queue_task() for managed_node2/stat
15406 1726854937.79194: worker is 1 (out of 1 available)
15406 1726854937.79205: exiting _queue_task() for managed_node2/stat
15406 1726854937.79215: done queuing things up, now waiting for results queue to drain
15406 1726854937.79217: waiting for pending results...
15406 1726854937.79991: running TaskExecutor() for managed_node2/TASK: Get stat for interface LSR-TST-br31
15406 1726854937.80117: in run() - task 0affcc66-ac2b-3c83-32d3-000000000133
15406 1726854937.80134: variable 'ansible_search_path' from source: unknown
15406 1726854937.80146: variable 'ansible_search_path' from source: unknown
15406 1726854937.80259: calling self._execute()
15406 1726854937.80478: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854937.80482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854937.80586: variable 'omit' from source: magic vars
15406 1726854937.82095: variable 'ansible_distribution_major_version' from source: facts
15406 1726854937.82224: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854937.82236: variable 'omit' from source: magic vars
15406 1726854937.82289: variable 'omit' from source: magic vars
15406 1726854937.82411: variable 'interface' from source: set_fact
15406 1726854937.82614: variable 'omit' from source: magic vars
15406 1726854937.82658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15406 1726854937.82892: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15406 1726854937.82895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15406 1726854937.82898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15406 1726854937.82900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15406 1726854937.82901: variable 'inventory_hostname' from source: host vars for 'managed_node2'
15406 1726854937.82903: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854937.82905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854937.83013: Set connection var ansible_module_compression to ZIP_DEFLATED
15406 1726854937.83031: Set connection var ansible_timeout to 10
15406 1726854937.83038: Set connection var ansible_connection to ssh
15406 1726854937.83047: Set connection var ansible_shell_type to sh
15406 1726854937.83055: Set connection var ansible_shell_executable to /bin/sh
15406 1726854937.83065: Set connection var ansible_pipelining to False
15406 1726854937.83095: variable 'ansible_shell_executable' from source: unknown
15406 1726854937.83104: variable 'ansible_connection' from source: unknown
15406 1726854937.83110: variable 'ansible_module_compression' from source: unknown
15406 1726854937.83116: variable 'ansible_shell_type' from source: unknown
15406 1726854937.83122: variable 'ansible_shell_executable' from source: unknown
15406 1726854937.83131: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854937.83142: variable 'ansible_pipelining' from source: unknown
15406 1726854937.83149: variable 'ansible_timeout' from source: unknown
15406 1726854937.83157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854937.83361: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
15406 1726854937.83375: variable 'omit' from source: magic vars
15406 1726854937.83385: starting attempt loop
15406 1726854937.83395: running the handler
15406 1726854937.83415: _low_level_execute_command(): starting
15406 1726854937.83520: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15406 1726854937.84891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15406 1726854937.84944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<<
15406 1726854937.84992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15406 1726854937.85047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15406 1726854937.85171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15406 1726854937.86871: stdout chunk (state=3): >>>/root <<<
15406 1726854937.87022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15406 1726854937.87129: stderr chunk (state=3): >>><<<
15406 1726854937.87147: stdout chunk (state=3): >>><<<
15406 1726854937.87342: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15406 1726854937.87346: _low_level_execute_command(): starting
15406 1726854937.87349: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854937.872104-15747-157588252447105 `" && echo ansible-tmp-1726854937.872104-15747-157588252447105="` echo /root/.ansible/tmp/ansible-tmp-1726854937.872104-15747-157588252447105 `" ) && sleep 0'
15406 1726854937.88950: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15406 1726854937.89006: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<<
15406 1726854937.89064: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15406 1726854937.89090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15406 1726854937.89462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15406 1726854937.91303: stdout chunk (state=3): >>>ansible-tmp-1726854937.872104-15747-157588252447105=/root/.ansible/tmp/ansible-tmp-1726854937.872104-15747-157588252447105 <<<
15406 1726854937.91462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15406 1726854937.91465: stdout chunk (state=3): >>><<<
15406 1726854937.91467: stderr chunk (state=3): >>><<<
15406 1726854937.91498: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854937.872104-15747-157588252447105=/root/.ansible/tmp/ansible-tmp-1726854937.872104-15747-157588252447105 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15406 1726854937.91570: variable 'ansible_module_compression' from source: unknown
15406 1726854937.91622: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
15406 1726854937.91663: variable 'ansible_facts' from source: unknown
15406 1726854937.91859: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854937.872104-15747-157588252447105/AnsiballZ_stat.py
15406 1726854937.91931: Sending initial data
15406 1726854937.91989: Sent initial data (152 bytes)
15406 1726854937.92699: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
15406 1726854937.92754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15406 1726854937.92833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<<
15406 1726854937.92854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15406 1726854937.92982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15406 1726854937.94716: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<<
15406 1726854937.94821: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpnsjxq95q /root/.ansible/tmp/ansible-tmp-1726854937.872104-15747-157588252447105/AnsiballZ_stat.py <<<
15406 1726854937.94824: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854937.872104-15747-157588252447105/AnsiballZ_stat.py" <<<
15406 1726854937.95008: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpnsjxq95q" to remote "/root/.ansible/tmp/ansible-tmp-1726854937.872104-15747-157588252447105/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854937.872104-15747-157588252447105/AnsiballZ_stat.py" <<<
15406 1726854937.97003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15406 1726854937.97007: stdout chunk (state=3): >>><<<
15406 1726854937.97010: stderr chunk (state=3): >>><<<
15406 1726854937.97012: done transferring module to remote
15406 1726854937.97014: _low_level_execute_command(): starting
15406 1726854937.97016: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854937.872104-15747-157588252447105/ /root/.ansible/tmp/ansible-tmp-1726854937.872104-15747-157588252447105/AnsiballZ_stat.py && sleep 0'
15406 1726854937.98158: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
15406 1726854937.98210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
15406 1726854937.98322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15406 1726854937.98425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<<
15406 1726854937.98512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15406 1726854937.98584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15406 1726854938.00464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15406 1726854938.00516: stdout chunk (state=3): >>><<<
15406 1726854938.00535: stderr chunk (state=3): >>><<<
15406 1726854938.00594: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15406 1726854938.00653: _low_level_execute_command(): starting
15406 1726854938.00722: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854937.872104-15747-157588252447105/AnsiballZ_stat.py && sleep 0'
15406 1726854938.02064: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15406 1726854938.02328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<<
15406 1726854938.02461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15406 1726854938.02474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15406 1726854938.02608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15406 1726854938.17763: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<<
15406 1726854938.19179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<<
15406 1726854938.19229: stdout chunk (state=3): >>><<<
15406 1726854938.19233: stderr chunk (state=3): >>><<<
15406 1726854938.19251: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed.
15406 1726854938.19372: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854937.872104-15747-157588252447105/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
15406 1726854938.19377: _low_level_execute_command(): starting
15406 1726854938.19379: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854937.872104-15747-157588252447105/ > /dev/null 2>&1 && sleep 0'
15406 1726854938.19972: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
15406 1726854938.19991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
15406 1726854938.20006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
15406 1726854938.20052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15406 1726854938.20123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<<
15406 1726854938.20152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
15406 1726854938.20260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15406 1726854938.22210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15406 1726854938.22258: stderr chunk (state=3): >>><<<
15406 1726854938.22261: stdout chunk (state=3): >>><<<
15406 1726854938.22416: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15406 1726854938.22420: handler run complete
15406 1726854938.22422: attempt loop complete, returning result
15406 1726854938.22425: _execute() done
15406 1726854938.22427: dumping result to json
15406 1726854938.22429: done dumping result, returning
15406 1726854938.22431: done running TaskExecutor() for managed_node2/TASK: Get stat for interface LSR-TST-br31 [0affcc66-ac2b-3c83-32d3-000000000133]
15406 1726854938.22437: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000133
ok: [managed_node2] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
15406 1726854938.22899: no more pending results, returning what we have
15406 1726854938.22902: results queue empty
15406 1726854938.22902: checking for any_errors_fatal
15406 1726854938.22903: done checking for any_errors_fatal
15406 1726854938.22904: checking for max_fail_percentage
15406 1726854938.22905: done checking for max_fail_percentage
15406 1726854938.22905: checking to see if all hosts have failed and the running result is not ok
15406 1726854938.22906: done checking to see if all hosts have failed
15406 1726854938.22906: getting the remaining hosts for this loop
15406 1726854938.22907: done getting the remaining hosts for this loop
15406 1726854938.22909: getting the next task for host managed_node2
15406 1726854938.22913: done getting next task for host managed_node2
15406 1726854938.22915: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}'
15406 1726854938.22916: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854938.22918: getting variables
15406 1726854938.22919: in VariableManager get_vars()
15406 1726854938.22936: Calling all_inventory to load vars for managed_node2
15406 1726854938.22938: Calling groups_inventory to load vars for managed_node2
15406 1726854938.22940: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854938.22948: Calling all_plugins_play to load vars for managed_node2
15406 1726854938.22949: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854938.22956: Calling groups_plugins_play to load vars for managed_node2
15406 1726854938.23051: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000133
15406 1726854938.23055: WORKER PROCESS EXITING
15406 1726854938.23067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854938.23177: done with get_vars()
15406 1726854938.23184: done getting variables
15406 1726854938.23251: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
15406 1726854938.23333: variable 'interface' from source: set_fact

TASK [Assert that the interface is absent - 'LSR-TST-br31'] ********************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5
Friday 20 September 2024 13:55:38 -0400 (0:00:00.447) 0:00:06.056 ******

15406 1726854938.23353: entering _queue_task() for managed_node2/assert
15406 1726854938.23354: Creating lock for assert
15406 1726854938.23561: worker is 1 (out of 1 available)
15406 1726854938.23572: exiting _queue_task() for managed_node2/assert
15406 1726854938.23583: done queuing things up, now waiting for results queue to drain
15406 1726854938.23584: waiting for pending results...
15406 1726854938.23738: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'LSR-TST-br31'
15406 1726854938.23801: in run() - task 0affcc66-ac2b-3c83-32d3-00000000011a
15406 1726854938.23811: variable 'ansible_search_path' from source: unknown
15406 1726854938.23817: variable 'ansible_search_path' from source: unknown
15406 1726854938.23850: calling self._execute()
15406 1726854938.23910: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854938.23918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854938.23929: variable 'omit' from source: magic vars
15406 1726854938.24248: variable 'ansible_distribution_major_version' from source: facts
15406 1726854938.24251: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854938.24254: variable 'omit' from source: magic vars
15406 1726854938.24258: variable 'omit' from source: magic vars
15406 1726854938.24296: variable 'interface' from source: set_fact
15406 1726854938.24309: variable 'omit' from source: magic vars
15406 1726854938.24339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15406 1726854938.24378: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15406 1726854938.24388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15406 1726854938.24401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15406 1726854938.24411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15406 1726854938.24433: variable 'inventory_hostname' from source: host vars for 'managed_node2'
15406 1726854938.24436: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854938.24439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854938.24510: Set connection var ansible_module_compression to ZIP_DEFLATED
15406 1726854938.24513: Set connection var ansible_timeout to 10
15406 1726854938.24516: Set connection var ansible_connection to ssh
15406 1726854938.24522: Set connection var ansible_shell_type to sh
15406 1726854938.24527: Set connection var ansible_shell_executable to /bin/sh
15406 1726854938.24533: Set connection var ansible_pipelining to False
15406 1726854938.24551: variable 'ansible_shell_executable' from source: unknown
15406 1726854938.24554: variable 'ansible_connection' from source: unknown
15406 1726854938.24557: variable 'ansible_module_compression' from source: unknown
15406 1726854938.24559: variable 'ansible_shell_type' from source: unknown
15406 1726854938.24561: variable 'ansible_shell_executable' from source: unknown
15406 1726854938.24564: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854938.24568: variable 'ansible_pipelining' from source: unknown
15406 1726854938.24570: variable 'ansible_timeout' from source: unknown
15406 1726854938.24572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854938.24670: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
15406 1726854938.24678: variable 'omit' from source: magic vars
15406 1726854938.24692: starting attempt loop
15406 1726854938.24696: running the handler
15406
1726854938.24786: variable 'interface_stat' from source: set_fact 15406 1726854938.24793: Evaluated conditional (not interface_stat.stat.exists): True 15406 1726854938.24801: handler run complete 15406 1726854938.24815: attempt loop complete, returning result 15406 1726854938.24818: _execute() done 15406 1726854938.24821: dumping result to json 15406 1726854938.24823: done dumping result, returning 15406 1726854938.24830: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' [0affcc66-ac2b-3c83-32d3-00000000011a] 15406 1726854938.24834: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000011a 15406 1726854938.24909: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000011a 15406 1726854938.24912: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 15406 1726854938.24957: no more pending results, returning what we have 15406 1726854938.24960: results queue empty 15406 1726854938.24961: checking for any_errors_fatal 15406 1726854938.24969: done checking for any_errors_fatal 15406 1726854938.24970: checking for max_fail_percentage 15406 1726854938.24972: done checking for max_fail_percentage 15406 1726854938.24972: checking to see if all hosts have failed and the running result is not ok 15406 1726854938.24973: done checking to see if all hosts have failed 15406 1726854938.24974: getting the remaining hosts for this loop 15406 1726854938.24975: done getting the remaining hosts for this loop 15406 1726854938.24978: getting the next task for host managed_node2 15406 1726854938.24989: done getting next task for host managed_node2 15406 1726854938.24991: ^ task is: TASK: meta (flush_handlers) 15406 1726854938.24992: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15406 1726854938.24996: getting variables 15406 1726854938.24998: in VariableManager get_vars() 15406 1726854938.25026: Calling all_inventory to load vars for managed_node2 15406 1726854938.25028: Calling groups_inventory to load vars for managed_node2 15406 1726854938.25034: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854938.25043: Calling all_plugins_play to load vars for managed_node2 15406 1726854938.25046: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854938.25049: Calling groups_plugins_play to load vars for managed_node2 15406 1726854938.25244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854938.25462: done with get_vars() 15406 1726854938.25473: done getting variables 15406 1726854938.25544: in VariableManager get_vars() 15406 1726854938.25554: Calling all_inventory to load vars for managed_node2 15406 1726854938.25556: Calling groups_inventory to load vars for managed_node2 15406 1726854938.25558: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854938.25562: Calling all_plugins_play to load vars for managed_node2 15406 1726854938.25564: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854938.25567: Calling groups_plugins_play to load vars for managed_node2 15406 1726854938.25709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854938.25894: done with get_vars() 15406 1726854938.25906: done queuing things up, now waiting for results queue to drain 15406 1726854938.25908: results queue empty 15406 1726854938.25909: checking for any_errors_fatal 15406 1726854938.25911: done checking for any_errors_fatal 15406 1726854938.25912: checking for max_fail_percentage 15406 1726854938.25913: done checking for max_fail_percentage 15406 1726854938.25913: checking to see if all 
hosts have failed and the running result is not ok 15406 1726854938.25914: done checking to see if all hosts have failed 15406 1726854938.25919: getting the remaining hosts for this loop 15406 1726854938.25920: done getting the remaining hosts for this loop 15406 1726854938.25923: getting the next task for host managed_node2 15406 1726854938.25926: done getting next task for host managed_node2 15406 1726854938.25927: ^ task is: TASK: meta (flush_handlers) 15406 1726854938.25929: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854938.25931: getting variables 15406 1726854938.25932: in VariableManager get_vars() 15406 1726854938.25939: Calling all_inventory to load vars for managed_node2 15406 1726854938.25941: Calling groups_inventory to load vars for managed_node2 15406 1726854938.25943: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854938.25947: Calling all_plugins_play to load vars for managed_node2 15406 1726854938.25950: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854938.25953: Calling groups_plugins_play to load vars for managed_node2 15406 1726854938.26086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854938.26285: done with get_vars() 15406 1726854938.26294: done getting variables 15406 1726854938.26325: in VariableManager get_vars() 15406 1726854938.26331: Calling all_inventory to load vars for managed_node2 15406 1726854938.26332: Calling groups_inventory to load vars for managed_node2 15406 1726854938.26334: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854938.26336: Calling all_plugins_play to load vars for managed_node2 15406 1726854938.26338: Calling 
groups_plugins_inventory to load vars for managed_node2 15406 1726854938.26339: Calling groups_plugins_play to load vars for managed_node2 15406 1726854938.26420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854938.26526: done with get_vars() 15406 1726854938.26533: done queuing things up, now waiting for results queue to drain 15406 1726854938.26534: results queue empty 15406 1726854938.26535: checking for any_errors_fatal 15406 1726854938.26535: done checking for any_errors_fatal 15406 1726854938.26536: checking for max_fail_percentage 15406 1726854938.26536: done checking for max_fail_percentage 15406 1726854938.26537: checking to see if all hosts have failed and the running result is not ok 15406 1726854938.26537: done checking to see if all hosts have failed 15406 1726854938.26538: getting the remaining hosts for this loop 15406 1726854938.26538: done getting the remaining hosts for this loop 15406 1726854938.26540: getting the next task for host managed_node2 15406 1726854938.26541: done getting next task for host managed_node2 15406 1726854938.26542: ^ task is: None 15406 1726854938.26543: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854938.26543: done queuing things up, now waiting for results queue to drain 15406 1726854938.26544: results queue empty 15406 1726854938.26544: checking for any_errors_fatal 15406 1726854938.26545: done checking for any_errors_fatal 15406 1726854938.26545: checking for max_fail_percentage 15406 1726854938.26546: done checking for max_fail_percentage 15406 1726854938.26546: checking to see if all hosts have failed and the running result is not ok 15406 1726854938.26546: done checking to see if all hosts have failed 15406 1726854938.26548: getting the next task for host managed_node2 15406 1726854938.26549: done getting next task for host managed_node2 15406 1726854938.26550: ^ task is: None 15406 1726854938.26550: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854938.26586: in VariableManager get_vars() 15406 1726854938.26602: done with get_vars() 15406 1726854938.26605: in VariableManager get_vars() 15406 1726854938.26615: done with get_vars() 15406 1726854938.26619: variable 'omit' from source: magic vars 15406 1726854938.26640: in VariableManager get_vars() 15406 1726854938.26649: done with get_vars() 15406 1726854938.26661: variable 'omit' from source: magic vars PLAY [Add test bridge] ********************************************************* 15406 1726854938.27028: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15406 1726854938.27047: getting the remaining hosts for this loop 15406 1726854938.27048: done getting the remaining hosts for this loop 15406 1726854938.27050: getting the next task for host managed_node2 15406 1726854938.27053: done getting next task for host managed_node2 15406 1726854938.27054: ^ task is: TASK: Gathering Facts 15406 1726854938.27055: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854938.27056: getting variables 15406 1726854938.27056: in VariableManager get_vars() 15406 1726854938.27064: Calling all_inventory to load vars for managed_node2 15406 1726854938.27066: Calling groups_inventory to load vars for managed_node2 15406 1726854938.27067: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854938.27071: Calling all_plugins_play to load vars for managed_node2 15406 1726854938.27072: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854938.27074: Calling groups_plugins_play to load vars for managed_node2 15406 1726854938.27151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854938.27257: done with get_vars() 15406 1726854938.27263: done getting variables 15406 1726854938.27293: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17 Friday 20 September 2024 13:55:38 -0400 (0:00:00.039) 0:00:06.095 ****** 15406 1726854938.27310: entering _queue_task() for managed_node2/gather_facts 15406 1726854938.27490: worker is 1 (out of 1 available) 15406 1726854938.27503: exiting _queue_task() for managed_node2/gather_facts 15406 1726854938.27516: done queuing things up, now waiting for results queue to drain 15406 1726854938.27517: waiting for pending results... 
15406 1726854938.27671: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15406 1726854938.27734: in run() - task 0affcc66-ac2b-3c83-32d3-00000000014c 15406 1726854938.27745: variable 'ansible_search_path' from source: unknown 15406 1726854938.27772: calling self._execute() 15406 1726854938.27831: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854938.27836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854938.27842: variable 'omit' from source: magic vars 15406 1726854938.28101: variable 'ansible_distribution_major_version' from source: facts 15406 1726854938.28110: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854938.28115: variable 'omit' from source: magic vars 15406 1726854938.28132: variable 'omit' from source: magic vars 15406 1726854938.28158: variable 'omit' from source: magic vars 15406 1726854938.28192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854938.28218: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854938.28233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854938.28245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854938.28256: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854938.28289: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854938.28294: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854938.28296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854938.28358: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 
1726854938.28365: Set connection var ansible_timeout to 10 15406 1726854938.28368: Set connection var ansible_connection to ssh 15406 1726854938.28373: Set connection var ansible_shell_type to sh 15406 1726854938.28380: Set connection var ansible_shell_executable to /bin/sh 15406 1726854938.28395: Set connection var ansible_pipelining to False 15406 1726854938.28408: variable 'ansible_shell_executable' from source: unknown 15406 1726854938.28410: variable 'ansible_connection' from source: unknown 15406 1726854938.28413: variable 'ansible_module_compression' from source: unknown 15406 1726854938.28416: variable 'ansible_shell_type' from source: unknown 15406 1726854938.28419: variable 'ansible_shell_executable' from source: unknown 15406 1726854938.28421: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854938.28424: variable 'ansible_pipelining' from source: unknown 15406 1726854938.28427: variable 'ansible_timeout' from source: unknown 15406 1726854938.28431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854938.28558: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854938.28566: variable 'omit' from source: magic vars 15406 1726854938.28570: starting attempt loop 15406 1726854938.28573: running the handler 15406 1726854938.28592: variable 'ansible_facts' from source: unknown 15406 1726854938.28606: _low_level_execute_command(): starting 15406 1726854938.28612: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854938.29127: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15406 1726854938.29133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854938.29159: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854938.29219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854938.29260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854938.29342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854938.31417: stdout chunk (state=3): >>>/root <<< 15406 1726854938.31422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854938.31424: stdout chunk (state=3): >>><<< 15406 1726854938.31426: stderr chunk (state=3): >>><<< 15406 1726854938.31430: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854938.31432: _low_level_execute_command(): starting 15406 1726854938.31434: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854938.3133533-15779-219457269948163 `" && echo ansible-tmp-1726854938.3133533-15779-219457269948163="` echo /root/.ansible/tmp/ansible-tmp-1726854938.3133533-15779-219457269948163 `" ) && sleep 0' 15406 1726854938.32760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854938.32775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 15406 1726854938.32802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854938.32842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854938.33014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854938.33104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854938.35066: stdout chunk (state=3): >>>ansible-tmp-1726854938.3133533-15779-219457269948163=/root/.ansible/tmp/ansible-tmp-1726854938.3133533-15779-219457269948163 <<< 15406 1726854938.35162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854938.35408: stderr chunk (state=3): >>><<< 15406 1726854938.35411: stdout chunk (state=3): >>><<< 15406 1726854938.35415: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854938.3133533-15779-219457269948163=/root/.ansible/tmp/ansible-tmp-1726854938.3133533-15779-219457269948163 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854938.35417: variable 'ansible_module_compression' from source: unknown 15406 1726854938.35603: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15406 1726854938.35606: variable 'ansible_facts' from source: unknown 15406 1726854938.36141: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854938.3133533-15779-219457269948163/AnsiballZ_setup.py 15406 1726854938.36384: Sending initial data 15406 1726854938.36448: Sent initial data (154 bytes) 15406 1726854938.37034: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854938.37048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854938.37063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854938.37103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854938.37126: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854938.37229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854938.37327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854938.37485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854938.39084: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15406 1726854938.39222: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854938.39306: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854938.39371: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpewczk6sb /root/.ansible/tmp/ansible-tmp-1726854938.3133533-15779-219457269948163/AnsiballZ_setup.py <<< 15406 1726854938.39374: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854938.3133533-15779-219457269948163/AnsiballZ_setup.py" <<< 15406 1726854938.39461: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpewczk6sb" to remote "/root/.ansible/tmp/ansible-tmp-1726854938.3133533-15779-219457269948163/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854938.3133533-15779-219457269948163/AnsiballZ_setup.py" <<< 15406 1726854938.42356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854938.42438: stderr chunk (state=3): >>><<< 15406 1726854938.42441: stdout chunk (state=3): >>><<< 15406 1726854938.42546: done transferring module to remote 15406 1726854938.42550: _low_level_execute_command(): starting 15406 1726854938.42552: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854938.3133533-15779-219457269948163/ /root/.ansible/tmp/ansible-tmp-1726854938.3133533-15779-219457269948163/AnsiballZ_setup.py && sleep 0' 15406 1726854938.43719: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854938.43732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15406 
1726854938.43771: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854938.43855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854938.44019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854938.44045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854938.44203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854938.45943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854938.46035: stderr chunk (state=3): >>><<< 15406 1726854938.46039: stdout chunk (state=3): >>><<< 15406 1726854938.46042: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854938.46056: _low_level_execute_command(): starting 15406 1726854938.46327: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854938.3133533-15779-219457269948163/AnsiballZ_setup.py && sleep 0' 15406 1726854938.47504: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854938.47724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854938.47732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854938.47735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854938.47820: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854939.11113: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansi<<< 15406 1726854939.11165: stdout chunk (state=3): >>>ble_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2953, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 578, "free": 2953}, "nocache": {"free": 3290, "used": 241}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": 
{"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 722, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797195776, "block_size": 4096, "block_total": 65519099, "block_available": 63915331, "block_used": 1603768, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.5009765625, "5m": 0.37353515625, "15m": 0.18212890625}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "55", "second": "39", "epoch": "1726854939", "epoch_int": "1726854939", "date": "2024-09-20", "time": "13:55:39", "iso8601_micro": "2024-09-20T17:55:39.071820Z", 
"iso8601": "2024-09-20T17:55:39Z", "iso8601_basic": "20240920T135539071820", "iso8601_basic_short": "20240920T135539", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": 
"on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], 
"ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15406 1726854939.13173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854939.13176: stdout chunk (state=3): >>><<< 15406 1726854939.13179: stderr chunk (state=3): >>><<< 15406 1726854939.13421: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", 
"ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2953, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 578, "free": 2953}, "nocache": {"free": 3290, "used": 241}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", 
"ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 722, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797195776, "block_size": 4096, "block_total": 65519099, "block_available": 63915331, "block_used": 1603768, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.5009765625, "5m": 0.37353515625, "15m": 0.18212890625}, "ansible_fips": false, "ansible_selinux_python_present": true, 
"ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "55", "second": "39", "epoch": "1726854939", "epoch_int": "1726854939", "date": "2024-09-20", "time": "13:55:39", "iso8601_micro": "2024-09-20T17:55:39.071820Z", "iso8601": "2024-09-20T17:55:39Z", "iso8601_basic": "20240920T135539071820", "iso8601_basic_short": "20240920T135539", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", 
"tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854939.13864: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854938.3133533-15779-219457269948163/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854939.13937: _low_level_execute_command(): starting 15406 1726854939.14003: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854938.3133533-15779-219457269948163/ > /dev/null 2>&1 && sleep 0' 15406 1726854939.16096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854939.16316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854939.16357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854939.16453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854939.18364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854939.18376: stdout chunk (state=3): >>><<< 15406 1726854939.18395: stderr chunk (state=3): >>><<< 15406 1726854939.18414: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 15406 1726854939.18426: handler run complete 15406 1726854939.18565: variable 'ansible_facts' from source: unknown 15406 1726854939.18799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854939.19002: variable 'ansible_facts' from source: unknown 15406 1726854939.19101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854939.19261: attempt loop complete, returning result 15406 1726854939.19270: _execute() done 15406 1726854939.19277: dumping result to json 15406 1726854939.19315: done dumping result, returning 15406 1726854939.19326: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-3c83-32d3-00000000014c] 15406 1726854939.19343: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000014c 15406 1726854939.19794: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000014c ok: [managed_node2] 15406 1726854939.20201: no more pending results, returning what we have 15406 1726854939.20204: results queue empty 15406 1726854939.20205: checking for any_errors_fatal 15406 1726854939.20206: done checking for any_errors_fatal 15406 1726854939.20207: checking for max_fail_percentage 15406 1726854939.20208: done checking for max_fail_percentage 15406 1726854939.20209: checking to see if all hosts have failed and the running result is not ok 15406 1726854939.20210: done checking to see if all hosts have failed 15406 1726854939.20211: getting the remaining hosts for this loop 15406 1726854939.20212: done getting the remaining hosts for this loop 15406 1726854939.20291: getting the next task for host managed_node2 15406 1726854939.20297: done getting next task for host managed_node2 15406 1726854939.20298: ^ task is: TASK: meta (flush_handlers) 15406 1726854939.20300: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854939.20303: getting variables 15406 1726854939.20305: in VariableManager get_vars() 15406 1726854939.20351: Calling all_inventory to load vars for managed_node2 15406 1726854939.20354: Calling groups_inventory to load vars for managed_node2 15406 1726854939.20356: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854939.20390: WORKER PROCESS EXITING 15406 1726854939.20400: Calling all_plugins_play to load vars for managed_node2 15406 1726854939.20403: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854939.20406: Calling groups_plugins_play to load vars for managed_node2 15406 1726854939.20591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854939.20804: done with get_vars() 15406 1726854939.20814: done getting variables 15406 1726854939.20888: in VariableManager get_vars() 15406 1726854939.20899: Calling all_inventory to load vars for managed_node2 15406 1726854939.20901: Calling groups_inventory to load vars for managed_node2 15406 1726854939.20904: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854939.20908: Calling all_plugins_play to load vars for managed_node2 15406 1726854939.20910: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854939.20912: Calling groups_plugins_play to load vars for managed_node2 15406 1726854939.21056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854939.21266: done with get_vars() 15406 1726854939.21278: done queuing things up, now waiting for results queue to drain 15406 1726854939.21280: results queue empty 15406 1726854939.21280: checking for 
any_errors_fatal 15406 1726854939.21286: done checking for any_errors_fatal 15406 1726854939.21288: checking for max_fail_percentage 15406 1726854939.21289: done checking for max_fail_percentage 15406 1726854939.21290: checking to see if all hosts have failed and the running result is not ok 15406 1726854939.21291: done checking to see if all hosts have failed 15406 1726854939.21295: getting the remaining hosts for this loop 15406 1726854939.21296: done getting the remaining hosts for this loop 15406 1726854939.21298: getting the next task for host managed_node2 15406 1726854939.21305: done getting next task for host managed_node2 15406 1726854939.21308: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15406 1726854939.21310: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854939.21323: getting variables 15406 1726854939.21324: in VariableManager get_vars() 15406 1726854939.21336: Calling all_inventory to load vars for managed_node2 15406 1726854939.21338: Calling groups_inventory to load vars for managed_node2 15406 1726854939.21340: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854939.21344: Calling all_plugins_play to load vars for managed_node2 15406 1726854939.21346: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854939.21348: Calling groups_plugins_play to load vars for managed_node2 15406 1726854939.21496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854939.21731: done with get_vars() 15406 1726854939.21747: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:55:39 -0400 (0:00:00.945) 0:00:07.040 ****** 15406 1726854939.21818: entering _queue_task() for managed_node2/include_tasks 15406 1726854939.22144: worker is 1 (out of 1 available) 15406 1726854939.22159: exiting _queue_task() for managed_node2/include_tasks 15406 1726854939.22292: done queuing things up, now waiting for results queue to drain 15406 1726854939.22294: waiting for pending results... 
15406 1726854939.22458: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15406 1726854939.22586: in run() - task 0affcc66-ac2b-3c83-32d3-000000000014 15406 1726854939.22609: variable 'ansible_search_path' from source: unknown 15406 1726854939.22627: variable 'ansible_search_path' from source: unknown 15406 1726854939.22667: calling self._execute() 15406 1726854939.22765: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854939.22776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854939.22796: variable 'omit' from source: magic vars 15406 1726854939.23185: variable 'ansible_distribution_major_version' from source: facts 15406 1726854939.23203: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854939.23213: _execute() done 15406 1726854939.23220: dumping result to json 15406 1726854939.23227: done dumping result, returning 15406 1726854939.23237: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-3c83-32d3-000000000014] 15406 1726854939.23245: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000014 15406 1726854939.23421: no more pending results, returning what we have 15406 1726854939.23426: in VariableManager get_vars() 15406 1726854939.23467: Calling all_inventory to load vars for managed_node2 15406 1726854939.23469: Calling groups_inventory to load vars for managed_node2 15406 1726854939.23473: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854939.23490: Calling all_plugins_play to load vars for managed_node2 15406 1726854939.23493: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854939.23496: Calling groups_plugins_play to load vars for managed_node2 15406 1726854939.23881: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000014 15406 
1726854939.23889: WORKER PROCESS EXITING 15406 1726854939.23923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854939.24103: done with get_vars() 15406 1726854939.24110: variable 'ansible_search_path' from source: unknown 15406 1726854939.24111: variable 'ansible_search_path' from source: unknown 15406 1726854939.24144: we have included files to process 15406 1726854939.24146: generating all_blocks data 15406 1726854939.24147: done generating all_blocks data 15406 1726854939.24148: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15406 1726854939.24149: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15406 1726854939.24151: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15406 1726854939.24855: done processing included file 15406 1726854939.24857: iterating over new_blocks loaded from include file 15406 1726854939.24859: in VariableManager get_vars() 15406 1726854939.24877: done with get_vars() 15406 1726854939.24879: filtering new block on tags 15406 1726854939.24906: done filtering new block on tags 15406 1726854939.24909: in VariableManager get_vars() 15406 1726854939.24928: done with get_vars() 15406 1726854939.24929: filtering new block on tags 15406 1726854939.24947: done filtering new block on tags 15406 1726854939.24950: in VariableManager get_vars() 15406 1726854939.24968: done with get_vars() 15406 1726854939.24969: filtering new block on tags 15406 1726854939.24986: done filtering new block on tags 15406 1726854939.24991: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 15406 1726854939.25000: extending task lists for all hosts 
with included blocks 15406 1726854939.25411: done extending task lists 15406 1726854939.25413: done processing included files 15406 1726854939.25414: results queue empty 15406 1726854939.25415: checking for any_errors_fatal 15406 1726854939.25416: done checking for any_errors_fatal 15406 1726854939.25417: checking for max_fail_percentage 15406 1726854939.25418: done checking for max_fail_percentage 15406 1726854939.25419: checking to see if all hosts have failed and the running result is not ok 15406 1726854939.25419: done checking to see if all hosts have failed 15406 1726854939.25420: getting the remaining hosts for this loop 15406 1726854939.25422: done getting the remaining hosts for this loop 15406 1726854939.25424: getting the next task for host managed_node2 15406 1726854939.25428: done getting next task for host managed_node2 15406 1726854939.25438: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15406 1726854939.25440: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854939.25449: getting variables 15406 1726854939.25450: in VariableManager get_vars() 15406 1726854939.25463: Calling all_inventory to load vars for managed_node2 15406 1726854939.25465: Calling groups_inventory to load vars for managed_node2 15406 1726854939.25467: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854939.25473: Calling all_plugins_play to load vars for managed_node2 15406 1726854939.25475: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854939.25478: Calling groups_plugins_play to load vars for managed_node2 15406 1726854939.25662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854939.25879: done with get_vars() 15406 1726854939.25893: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:55:39 -0400 (0:00:00.041) 0:00:07.082 ****** 15406 1726854939.25961: entering _queue_task() for managed_node2/setup 15406 1726854939.26423: worker is 1 (out of 1 available) 15406 1726854939.26435: exiting _queue_task() for managed_node2/setup 15406 1726854939.26447: done queuing things up, now waiting for results queue to drain 15406 1726854939.26448: waiting for pending results... 
15406 1726854939.26622: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15406 1726854939.26762: in run() - task 0affcc66-ac2b-3c83-32d3-00000000018d 15406 1726854939.26790: variable 'ansible_search_path' from source: unknown 15406 1726854939.26799: variable 'ansible_search_path' from source: unknown 15406 1726854939.26836: calling self._execute() 15406 1726854939.26927: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854939.26938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854939.26952: variable 'omit' from source: magic vars 15406 1726854939.27338: variable 'ansible_distribution_major_version' from source: facts 15406 1726854939.27355: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854939.27574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854939.29795: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854939.29871: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854939.29925: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854939.29964: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854939.30026: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854939.30085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854939.30123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854939.30194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854939.30209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854939.30227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854939.30292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854939.30325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854939.30360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854939.30408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854939.30460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854939.30611: variable '__network_required_facts' from source: role 
'' defaults 15406 1726854939.30626: variable 'ansible_facts' from source: unknown 15406 1726854939.30730: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15406 1726854939.30790: when evaluation is False, skipping this task 15406 1726854939.30793: _execute() done 15406 1726854939.30795: dumping result to json 15406 1726854939.30797: done dumping result, returning 15406 1726854939.30799: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-3c83-32d3-00000000018d] 15406 1726854939.30801: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000018d 15406 1726854939.31041: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000018d 15406 1726854939.31044: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15406 1726854939.31094: no more pending results, returning what we have 15406 1726854939.31098: results queue empty 15406 1726854939.31099: checking for any_errors_fatal 15406 1726854939.31101: done checking for any_errors_fatal 15406 1726854939.31102: checking for max_fail_percentage 15406 1726854939.31103: done checking for max_fail_percentage 15406 1726854939.31104: checking to see if all hosts have failed and the running result is not ok 15406 1726854939.31105: done checking to see if all hosts have failed 15406 1726854939.31106: getting the remaining hosts for this loop 15406 1726854939.31107: done getting the remaining hosts for this loop 15406 1726854939.31111: getting the next task for host managed_node2 15406 1726854939.31121: done getting next task for host managed_node2 15406 1726854939.31125: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15406 1726854939.31128: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854939.31141: getting variables 15406 1726854939.31143: in VariableManager get_vars() 15406 1726854939.31185: Calling all_inventory to load vars for managed_node2 15406 1726854939.31190: Calling groups_inventory to load vars for managed_node2 15406 1726854939.31193: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854939.31204: Calling all_plugins_play to load vars for managed_node2 15406 1726854939.31207: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854939.31210: Calling groups_plugins_play to load vars for managed_node2 15406 1726854939.31572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854939.31790: done with get_vars() 15406 1726854939.31801: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:55:39 -0400 (0:00:00.059) 0:00:07.141 ****** 15406 1726854939.31903: entering _queue_task() for managed_node2/stat 15406 1726854939.32285: worker is 1 (out of 1 available) 15406 1726854939.32298: exiting _queue_task() for managed_node2/stat 15406 1726854939.32308: done queuing things up, now waiting for results queue to drain 15406 1726854939.32309: waiting for pending results... 
15406 1726854939.32462: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 15406 1726854939.32572: in run() - task 0affcc66-ac2b-3c83-32d3-00000000018f 15406 1726854939.32604: variable 'ansible_search_path' from source: unknown 15406 1726854939.32606: variable 'ansible_search_path' from source: unknown 15406 1726854939.32713: calling self._execute() 15406 1726854939.32722: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854939.32731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854939.32743: variable 'omit' from source: magic vars 15406 1726854939.33109: variable 'ansible_distribution_major_version' from source: facts 15406 1726854939.33127: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854939.33309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854939.33672: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854939.33739: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854939.33780: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854939.33828: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854939.33930: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854939.33992: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854939.34000: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854939.34040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854939.34140: variable '__network_is_ostree' from source: set_fact 15406 1726854939.34152: Evaluated conditional (not __network_is_ostree is defined): False 15406 1726854939.34192: when evaluation is False, skipping this task 15406 1726854939.34195: _execute() done 15406 1726854939.34198: dumping result to json 15406 1726854939.34200: done dumping result, returning 15406 1726854939.34203: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-3c83-32d3-00000000018f] 15406 1726854939.34205: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000018f 15406 1726854939.34305: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000018f skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15406 1726854939.34504: no more pending results, returning what we have 15406 1726854939.34507: results queue empty 15406 1726854939.34508: checking for any_errors_fatal 15406 1726854939.34512: done checking for any_errors_fatal 15406 1726854939.34513: checking for max_fail_percentage 15406 1726854939.34514: done checking for max_fail_percentage 15406 1726854939.34515: checking to see if all hosts have failed and the running result is not ok 15406 1726854939.34516: done checking to see if all hosts have failed 15406 1726854939.34517: getting the remaining hosts for this loop 15406 1726854939.34518: done getting the remaining hosts for this loop 15406 1726854939.34521: getting the next task for host 
managed_node2 15406 1726854939.34526: done getting next task for host managed_node2 15406 1726854939.34529: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15406 1726854939.34532: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854939.34544: getting variables 15406 1726854939.34546: in VariableManager get_vars() 15406 1726854939.34594: Calling all_inventory to load vars for managed_node2 15406 1726854939.34597: Calling groups_inventory to load vars for managed_node2 15406 1726854939.34599: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854939.34679: WORKER PROCESS EXITING 15406 1726854939.34695: Calling all_plugins_play to load vars for managed_node2 15406 1726854939.34698: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854939.34701: Calling groups_plugins_play to load vars for managed_node2 15406 1726854939.34946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854939.35165: done with get_vars() 15406 1726854939.35174: done getting variables 15406 1726854939.35243: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:55:39 -0400 (0:00:00.033) 0:00:07.175 ****** 15406 1726854939.35273: entering _queue_task() for managed_node2/set_fact 15406 1726854939.35463: worker is 1 (out of 1 available) 15406 1726854939.35478: exiting _queue_task() for managed_node2/set_fact 15406 1726854939.35500: done queuing things up, now waiting for results queue to drain 15406 1726854939.35502: waiting for pending results... 15406 1726854939.35644: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15406 1726854939.35715: in run() - task 0affcc66-ac2b-3c83-32d3-000000000190 15406 1726854939.35726: variable 'ansible_search_path' from source: unknown 15406 1726854939.35730: variable 'ansible_search_path' from source: unknown 15406 1726854939.35753: calling self._execute() 15406 1726854939.35809: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854939.35814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854939.35822: variable 'omit' from source: magic vars 15406 1726854939.36063: variable 'ansible_distribution_major_version' from source: facts 15406 1726854939.36072: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854939.36394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854939.36501: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854939.36557: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854939.36595: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 
1726854939.36644: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854939.36745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854939.36777: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854939.36812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854939.36855: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854939.36953: variable '__network_is_ostree' from source: set_fact 15406 1726854939.36967: Evaluated conditional (not __network_is_ostree is defined): False 15406 1726854939.36976: when evaluation is False, skipping this task 15406 1726854939.36984: _execute() done 15406 1726854939.36995: dumping result to json 15406 1726854939.37005: done dumping result, returning 15406 1726854939.37017: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-3c83-32d3-000000000190] 15406 1726854939.37062: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000190 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15406 1726854939.37213: no more pending results, returning what we have 15406 1726854939.37221: results queue empty 15406 1726854939.37223: checking for any_errors_fatal 15406 1726854939.37228: done checking 
for any_errors_fatal 15406 1726854939.37228: checking for max_fail_percentage 15406 1726854939.37230: done checking for max_fail_percentage 15406 1726854939.37232: checking to see if all hosts have failed and the running result is not ok 15406 1726854939.37233: done checking to see if all hosts have failed 15406 1726854939.37234: getting the remaining hosts for this loop 15406 1726854939.37235: done getting the remaining hosts for this loop 15406 1726854939.37238: getting the next task for host managed_node2 15406 1726854939.37248: done getting next task for host managed_node2 15406 1726854939.37251: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15406 1726854939.37255: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854939.37266: getting variables 15406 1726854939.37268: in VariableManager get_vars() 15406 1726854939.37418: Calling all_inventory to load vars for managed_node2 15406 1726854939.37421: Calling groups_inventory to load vars for managed_node2 15406 1726854939.37423: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854939.37431: Calling all_plugins_play to load vars for managed_node2 15406 1726854939.37434: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854939.37437: Calling groups_plugins_play to load vars for managed_node2 15406 1726854939.37697: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000190 15406 1726854939.37701: WORKER PROCESS EXITING 15406 1726854939.37731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854939.37943: done with get_vars() 15406 1726854939.37952: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:55:39 -0400 (0:00:00.027) 0:00:07.203 ****** 15406 1726854939.38046: entering _queue_task() for managed_node2/service_facts 15406 1726854939.38048: Creating lock for service_facts 15406 1726854939.38294: worker is 1 (out of 1 available) 15406 1726854939.38307: exiting _queue_task() for managed_node2/service_facts 15406 1726854939.38320: done queuing things up, now waiting for results queue to drain 15406 1726854939.38322: waiting for pending results... 
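The task being queued here runs the `service_facts` module; its result (visible in the stdout chunk later in this log) is a JSON payload keyed by systemd unit name under `ansible_facts.services`, where each entry carries `name`, `state`, `status`, and `source`. As a hedged sketch of how a consumer of that fact might filter it (the dict literal below is an illustrative subset mirroring the structure in this log, not the full payload, and the filtering itself is an example, not the role's actual logic):

```python
# Minimal sketch: filter a service_facts-style payload for running units.
# The sample dict mirrors the structure seen in this log's stdout chunk;
# the entries are a trimmed, illustrative subset of the real output.
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "NetworkManager.service": {"name": "NetworkManager.service",
                               "state": "running", "status": "enabled",
                               "source": "systemd"},
    "autofs.service": {"name": "autofs.service", "state": "stopped",
                       "status": "not-found", "source": "systemd"},
}

# Keep only the units systemd reports as currently running.
running = sorted(name for name, svc in services.items()
                 if svc["state"] == "running")
print(running)  # → ['NetworkManager.service', 'sshd.service']
```

In a playbook the same filtering is typically done in Jinja2 (e.g. `ansible_facts.services | dict2items | selectattr('value.state', 'eq', 'running')`) after a `service_facts:` task has populated the fact.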
15406 1726854939.38542: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 15406 1726854939.38828: in run() - task 0affcc66-ac2b-3c83-32d3-000000000192 15406 1726854939.38832: variable 'ansible_search_path' from source: unknown 15406 1726854939.38836: variable 'ansible_search_path' from source: unknown 15406 1726854939.38838: calling self._execute() 15406 1726854939.38870: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854939.38874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854939.38890: variable 'omit' from source: magic vars 15406 1726854939.39645: variable 'ansible_distribution_major_version' from source: facts 15406 1726854939.39663: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854939.39666: variable 'omit' from source: magic vars 15406 1726854939.39741: variable 'omit' from source: magic vars 15406 1726854939.39781: variable 'omit' from source: magic vars 15406 1726854939.40366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854939.40370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854939.40373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854939.40375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854939.40383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854939.40418: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854939.40422: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854939.40424: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 15406 1726854939.40733: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854939.40740: Set connection var ansible_timeout to 10 15406 1726854939.40743: Set connection var ansible_connection to ssh 15406 1726854939.40748: Set connection var ansible_shell_type to sh 15406 1726854939.40794: Set connection var ansible_shell_executable to /bin/sh 15406 1726854939.40802: Set connection var ansible_pipelining to False 15406 1726854939.40913: variable 'ansible_shell_executable' from source: unknown 15406 1726854939.40917: variable 'ansible_connection' from source: unknown 15406 1726854939.40920: variable 'ansible_module_compression' from source: unknown 15406 1726854939.40922: variable 'ansible_shell_type' from source: unknown 15406 1726854939.40924: variable 'ansible_shell_executable' from source: unknown 15406 1726854939.40926: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854939.40928: variable 'ansible_pipelining' from source: unknown 15406 1726854939.40930: variable 'ansible_timeout' from source: unknown 15406 1726854939.40933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854939.42098: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854939.42114: variable 'omit' from source: magic vars 15406 1726854939.42141: starting attempt loop 15406 1726854939.42145: running the handler 15406 1726854939.42147: _low_level_execute_command(): starting 15406 1726854939.42229: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854939.44449: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854939.44483: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854939.44636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854939.44680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854939.44817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854939.44936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854939.46995: stdout chunk (state=3): >>>/root <<< 15406 1726854939.46999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854939.47001: stdout chunk (state=3): >>><<< 15406 1726854939.47003: stderr chunk (state=3): >>><<< 15406 1726854939.47007: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854939.47008: _low_level_execute_command(): starting 15406 1726854939.47011: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854939.4688077-15835-111839729735681 `" && echo ansible-tmp-1726854939.4688077-15835-111839729735681="` echo /root/.ansible/tmp/ansible-tmp-1726854939.4688077-15835-111839729735681 `" ) && sleep 0' 15406 1726854939.48505: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854939.48755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854939.48759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854939.48819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854939.50733: stdout chunk (state=3): >>>ansible-tmp-1726854939.4688077-15835-111839729735681=/root/.ansible/tmp/ansible-tmp-1726854939.4688077-15835-111839729735681 <<< 15406 1726854939.50835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854939.50882: stderr chunk (state=3): >>><<< 15406 1726854939.50890: stdout chunk (state=3): >>><<< 15406 1726854939.50959: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854939.4688077-15835-111839729735681=/root/.ansible/tmp/ansible-tmp-1726854939.4688077-15835-111839729735681 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 
10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854939.50963: variable 'ansible_module_compression' from source: unknown 15406 1726854939.51106: ANSIBALLZ: Using lock for service_facts 15406 1726854939.51109: ANSIBALLZ: Acquiring lock 15406 1726854939.51112: ANSIBALLZ: Lock acquired: 140626831000736 15406 1726854939.51114: ANSIBALLZ: Creating module 15406 1726854939.72378: ANSIBALLZ: Writing module into payload 15406 1726854939.72471: ANSIBALLZ: Writing module 15406 1726854939.72495: ANSIBALLZ: Renaming module 15406 1726854939.72509: ANSIBALLZ: Done creating module 15406 1726854939.72524: variable 'ansible_facts' from source: unknown 15406 1726854939.72705: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854939.4688077-15835-111839729735681/AnsiballZ_service_facts.py 15406 1726854939.72815: Sending initial data 15406 1726854939.72819: Sent initial data (162 bytes) 15406 1726854939.74019: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854939.74137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854939.74449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854939.74613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854939.76271: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854939.76327: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854939.76423: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp0aih2hp0 /root/.ansible/tmp/ansible-tmp-1726854939.4688077-15835-111839729735681/AnsiballZ_service_facts.py <<< 15406 1726854939.76427: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854939.4688077-15835-111839729735681/AnsiballZ_service_facts.py" <<< 15406 1726854939.76533: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp0aih2hp0" to remote "/root/.ansible/tmp/ansible-tmp-1726854939.4688077-15835-111839729735681/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854939.4688077-15835-111839729735681/AnsiballZ_service_facts.py" <<< 15406 1726854939.77691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854939.77757: stderr chunk (state=3): >>><<< 15406 1726854939.77770: stdout chunk (state=3): >>><<< 15406 1726854939.77992: done transferring module to remote 15406 1726854939.77995: _low_level_execute_command(): starting 15406 1726854939.77998: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854939.4688077-15835-111839729735681/ /root/.ansible/tmp/ansible-tmp-1726854939.4688077-15835-111839729735681/AnsiballZ_service_facts.py && sleep 0' 15406 1726854939.78459: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854939.78501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854939.78508: stderr chunk (state=3): >>>debug2: match found <<< 15406 1726854939.78519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854939.78614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854939.78621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854939.78716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854939.80634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854939.80637: stderr chunk (state=3): >>><<< 15406 1726854939.80646: stdout chunk (state=3): >>><<< 15406 1726854939.80679: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854939.80682: _low_level_execute_command(): starting 15406 1726854939.80691: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854939.4688077-15835-111839729735681/AnsiballZ_service_facts.py && sleep 0' 15406 1726854939.81290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854939.81294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854939.81296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854939.81299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854939.81301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854939.81303: stderr chunk (state=3): >>>debug2: match not found <<< 15406 1726854939.81310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854939.81324: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15406 1726854939.81331: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 15406 1726854939.81399: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15406 1726854939.81402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854939.81404: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15406 1726854939.81406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854939.81408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854939.81410: stderr chunk (state=3): >>>debug2: match found <<< 15406 1726854939.81412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854939.81448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854939.81460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854939.81485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854939.81564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854941.34404: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": 
"cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": 
"sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": 
{"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": 
"selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": <<< 15406 1726854941.34525: stdout chunk (state=3): >>>"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": 
"systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15406 1726854941.36371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 15406 1726854941.36375: stdout chunk (state=3): >>><<< 15406 1726854941.36378: stderr chunk (state=3): >>><<< 15406 1726854941.36382: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": 
"nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
15406 1726854941.37701: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854939.4688077-15835-111839729735681/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854941.37728: _low_level_execute_command(): starting 15406 1726854941.37740: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854939.4688077-15835-111839729735681/ > /dev/null 2>&1 && sleep 0' 15406 1726854941.38358: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854941.38374: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854941.38392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854941.38413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854941.38431: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854941.38448: stderr chunk (state=3): >>>debug2: match not found <<< 15406 1726854941.38551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854941.38573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854941.38682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854941.40643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854941.40770: stdout chunk (state=3): >>><<< 15406 1726854941.40773: stderr chunk (state=3): >>><<< 15406 1726854941.40960: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854941.41125: handler run complete 15406 1726854941.41748: variable 'ansible_facts' from source: unknown 15406 1726854941.42279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854941.43162: variable 'ansible_facts' from source: unknown 15406 1726854941.43339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854941.43599: attempt loop complete, returning result 15406 1726854941.43609: _execute() done 15406 1726854941.43616: dumping result to json 15406 1726854941.43693: done dumping result, returning 15406 1726854941.43708: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-3c83-32d3-000000000192] 15406 1726854941.43718: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000192 15406 1726854941.46463: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000192 15406 1726854941.46467: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15406 1726854941.46579: no more pending results, returning what we have 15406 1726854941.46581: results queue empty 15406 1726854941.46583: checking for any_errors_fatal 15406 1726854941.46590: done checking for any_errors_fatal 15406 1726854941.46591: checking for max_fail_percentage 15406 1726854941.46593: done checking for max_fail_percentage 15406 1726854941.46593: checking to see if all hosts have failed and the running result is not ok 15406 1726854941.46594: done checking to see if all hosts have failed 15406 1726854941.46595: getting the remaining hosts for this loop 15406 1726854941.46596: done getting the remaining hosts for this loop 15406 1726854941.46599: getting 
the next task for host managed_node2 15406 1726854941.46604: done getting next task for host managed_node2 15406 1726854941.46607: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15406 1726854941.46610: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854941.46618: getting variables 15406 1726854941.46620: in VariableManager get_vars() 15406 1726854941.46657: Calling all_inventory to load vars for managed_node2 15406 1726854941.46736: Calling groups_inventory to load vars for managed_node2 15406 1726854941.46740: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854941.46749: Calling all_plugins_play to load vars for managed_node2 15406 1726854941.46752: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854941.46755: Calling groups_plugins_play to load vars for managed_node2 15406 1726854941.48316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854941.49176: done with get_vars() 15406 1726854941.49192: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:55:41 -0400 (0:00:02.112) 0:00:09.315 ****** 15406 1726854941.49282: entering _queue_task() for managed_node2/package_facts 15406 
1726854941.49284: Creating lock for package_facts 15406 1726854941.50038: worker is 1 (out of 1 available) 15406 1726854941.50050: exiting _queue_task() for managed_node2/package_facts 15406 1726854941.50063: done queuing things up, now waiting for results queue to drain 15406 1726854941.50064: waiting for pending results... 15406 1726854941.50892: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 15406 1726854941.51364: in run() - task 0affcc66-ac2b-3c83-32d3-000000000193 15406 1726854941.51440: variable 'ansible_search_path' from source: unknown 15406 1726854941.51445: variable 'ansible_search_path' from source: unknown 15406 1726854941.51596: calling self._execute() 15406 1726854941.51723: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854941.51781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854941.51990: variable 'omit' from source: magic vars 15406 1726854941.52956: variable 'ansible_distribution_major_version' from source: facts 15406 1726854941.53013: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854941.53024: variable 'omit' from source: magic vars 15406 1726854941.53190: variable 'omit' from source: magic vars 15406 1726854941.53283: variable 'omit' from source: magic vars 15406 1726854941.53477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854941.53515: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854941.53601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854941.53897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854941.53901: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854941.53904: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854941.53906: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854941.53908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854941.54192: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854941.54196: Set connection var ansible_timeout to 10 15406 1726854941.54200: Set connection var ansible_connection to ssh 15406 1726854941.54358: Set connection var ansible_shell_type to sh 15406 1726854941.54370: Set connection var ansible_shell_executable to /bin/sh 15406 1726854941.54417: Set connection var ansible_pipelining to False 15406 1726854941.54491: variable 'ansible_shell_executable' from source: unknown 15406 1726854941.54728: variable 'ansible_connection' from source: unknown 15406 1726854941.54732: variable 'ansible_module_compression' from source: unknown 15406 1726854941.54734: variable 'ansible_shell_type' from source: unknown 15406 1726854941.54736: variable 'ansible_shell_executable' from source: unknown 15406 1726854941.54738: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854941.54741: variable 'ansible_pipelining' from source: unknown 15406 1726854941.54743: variable 'ansible_timeout' from source: unknown 15406 1726854941.54745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854941.55594: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854941.55599: variable 'omit' from source: magic vars 15406 1726854941.55601: starting attempt loop 15406 1726854941.55604: running 
the handler 15406 1726854941.55606: _low_level_execute_command(): starting 15406 1726854941.55608: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854941.56510: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854941.56576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854941.56605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854941.56703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854941.58550: stdout chunk (state=3): >>>/root <<< 15406 1726854941.58619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854941.58895: stdout chunk (state=3): >>><<< 15406 1726854941.58899: stderr chunk (state=3): >>><<< 15406 1726854941.58902: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854941.58906: _low_level_execute_command(): starting 15406 1726854941.58909: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854941.5883791-15953-93298743275172 `" && echo ansible-tmp-1726854941.5883791-15953-93298743275172="` echo /root/.ansible/tmp/ansible-tmp-1726854941.5883791-15953-93298743275172 `" ) && sleep 0' 15406 1726854941.60519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854941.60807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854941.60910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854941.62819: stdout chunk (state=3): >>>ansible-tmp-1726854941.5883791-15953-93298743275172=/root/.ansible/tmp/ansible-tmp-1726854941.5883791-15953-93298743275172 <<< 15406 1726854941.62973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854941.62983: stdout chunk (state=3): >>><<< 15406 1726854941.62999: stderr chunk (state=3): >>><<< 15406 1726854941.63035: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854941.5883791-15953-93298743275172=/root/.ansible/tmp/ansible-tmp-1726854941.5883791-15953-93298743275172 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854941.63447: variable 'ansible_module_compression' from source: unknown 15406 1726854941.63521: ANSIBALLZ: Using lock for package_facts 15406 1726854941.63685: ANSIBALLZ: Acquiring lock 15406 1726854941.63714: ANSIBALLZ: Lock acquired: 140626831918528 15406 1726854941.63738: ANSIBALLZ: Creating module 15406 1726854942.34533: ANSIBALLZ: Writing module into payload 15406 1726854942.34984: ANSIBALLZ: Writing module 15406 1726854942.35036: ANSIBALLZ: Renaming module 15406 1726854942.35063: ANSIBALLZ: Done creating module 15406 1726854942.35381: variable 'ansible_facts' from source: unknown 15406 1726854942.35695: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854941.5883791-15953-93298743275172/AnsiballZ_package_facts.py 15406 1726854942.36023: Sending initial data 15406 1726854942.36026: Sent initial data (161 bytes) 15406 1726854942.36913: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854942.37003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854942.37007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854942.37089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854942.38842: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854942.38884: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854942.38974: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp_lpw9ikg /root/.ansible/tmp/ansible-tmp-1726854941.5883791-15953-93298743275172/AnsiballZ_package_facts.py <<< 15406 1726854942.38978: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854941.5883791-15953-93298743275172/AnsiballZ_package_facts.py" <<< 15406 1726854942.39028: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp_lpw9ikg" to remote "/root/.ansible/tmp/ansible-tmp-1726854941.5883791-15953-93298743275172/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854941.5883791-15953-93298743275172/AnsiballZ_package_facts.py" <<< 15406 1726854942.42518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854942.42590: stderr chunk (state=3): >>><<< 15406 1726854942.42609: stdout chunk (state=3): >>><<< 15406 1726854942.42643: done transferring module to remote 15406 1726854942.42772: _low_level_execute_command(): starting 15406 1726854942.42776: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854941.5883791-15953-93298743275172/ /root/.ansible/tmp/ansible-tmp-1726854941.5883791-15953-93298743275172/AnsiballZ_package_facts.py && sleep 0' 15406 1726854942.43697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854942.43849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854942.44013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854942.44407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854942.46097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854942.46101: stdout chunk (state=3): >>><<< 15406 1726854942.46108: stderr chunk (state=3): >>><<< 15406 1726854942.46135: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854942.46138: _low_level_execute_command(): starting 15406 1726854942.46141: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854941.5883791-15953-93298743275172/AnsiballZ_package_facts.py && sleep 0' 15406 1726854942.47573: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854942.47577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854942.47667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854942.47672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854942.47675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854942.47678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854942.47680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854942.47758: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15406 1726854942.47858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854942.91900: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 15406 1726854942.91918: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": 
"keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": 
"3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 15406 1726854942.91943: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", 
"version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 15406 1726854942.91952: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", 
"version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 15406 1726854942.91959: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": 
"cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 15406 1726854942.91966: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", 
"release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 15406 1726854942.91971: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 15406 1726854942.91979: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": 
[{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": 
[{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": 
"rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null,
"arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}],
"python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": 
"lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15406 1726854942.93994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
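The JSON payload above is the result of Ansible's package_facts module (the trailing "invocation" block shows "manager": ["auto"], "strategy": "first"). In the returned structure, each key under "ansible_facts.packages" maps to a *list* of package dicts, since several versions or architectures of one package can be installed side by side. A minimal Python sketch of how such a payload can be post-processed — the two sample entries are copied from the captured output, and the nevra() helper is illustrative, not part of Ansible:

```python
# Sample mirroring two entries from the package_facts output above.
# Note each value is a list of dicts, and "epoch" may be null (None).
packages = {
    "git": [{"name": "git", "version": "2.45.2", "release": "3.el10",
             "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "tar": [{"name": "tar", "version": "1.35", "release": "4.el10",
             "epoch": 2, "arch": "x86_64", "source": "rpm"}],
}

def nevra(pkg):
    """Build an RPM-style name-[epoch:]version-release.arch string."""
    epoch = f"{pkg['epoch']}:" if pkg["epoch"] not in (None, 0) else ""
    return f"{pkg['name']}-{epoch}{pkg['version']}-{pkg['release']}.{pkg['arch']}"

# Flatten every installed instance of every package into NEVRA strings.
installed = [nevra(p) for plist in packages.values() for p in plist]
print(installed)  # ['git-2.45.2-3.el10.x86_64', 'tar-2:1.35-4.el10.x86_64']
```

In a playbook, the same structure is reachable as the hostvar ansible_facts.packages after a package_facts task runs, so conditionals like checking whether "git" is a key of that dict work without shelling out to rpm.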
<<< 15406 1726854942.93998: stdout chunk (state=3): >>><<< 15406 1726854942.94000: stderr chunk (state=3): >>><<< 15406 1726854942.94068: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
15406 1726854942.96684: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854941.5883791-15953-93298743275172/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854942.96705: _low_level_execute_command(): starting 15406 1726854942.96709: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854941.5883791-15953-93298743275172/ > /dev/null 2>&1 && sleep 0' 15406 1726854942.97154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854942.97158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854942.97161: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854942.97163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match found <<< 15406 1726854942.97165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854942.97223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854942.97226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854942.97295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854942.99297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854942.99300: stdout chunk (state=3): >>><<< 15406 1726854942.99303: stderr chunk (state=3): >>><<< 15406 1726854942.99305: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 15406 1726854942.99307: handler run complete 15406 1726854942.99900: variable 'ansible_facts' from source: unknown 15406 1726854943.00153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854943.01251: variable 'ansible_facts' from source: unknown 15406 1726854943.01713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854943.02309: attempt loop complete, returning result 15406 1726854943.02319: _execute() done 15406 1726854943.02322: dumping result to json 15406 1726854943.02468: done dumping result, returning 15406 1726854943.02479: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-3c83-32d3-000000000193] 15406 1726854943.02481: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000193 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15406 1726854943.03760: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000193 15406 1726854943.03764: WORKER PROCESS EXITING 15406 1726854943.03769: no more pending results, returning what we have 15406 1726854943.03771: results queue empty 15406 1726854943.03772: checking for any_errors_fatal 15406 1726854943.03777: done checking for any_errors_fatal 15406 1726854943.03777: checking for max_fail_percentage 15406 1726854943.03778: done checking for max_fail_percentage 15406 1726854943.03779: checking to see if all hosts have failed and the running result is not ok 15406 1726854943.03780: done checking to see if all hosts have failed 15406 1726854943.03780: getting the remaining hosts for this loop 15406 1726854943.03781: done getting the remaining hosts for this loop 15406 1726854943.03783: getting the next task for host managed_node2 15406 1726854943.03791: done 
getting next task for host managed_node2 15406 1726854943.03794: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15406 1726854943.03795: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854943.03801: getting variables 15406 1726854943.03802: in VariableManager get_vars() 15406 1726854943.03822: Calling all_inventory to load vars for managed_node2 15406 1726854943.03825: Calling groups_inventory to load vars for managed_node2 15406 1726854943.03827: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854943.03835: Calling all_plugins_play to load vars for managed_node2 15406 1726854943.03836: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854943.03838: Calling groups_plugins_play to load vars for managed_node2 15406 1726854943.04627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854943.06084: done with get_vars() 15406 1726854943.06114: done getting variables 15406 1726854943.06166: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:55:43 -0400 (0:00:01.569) 0:00:10.884 ****** 15406 1726854943.06205: entering _queue_task() for managed_node2/debug 15406 1726854943.06531: worker is 1 (out of 1 available) 15406 
1726854943.06545: exiting _queue_task() for managed_node2/debug 15406 1726854943.06558: done queuing things up, now waiting for results queue to drain 15406 1726854943.06559: waiting for pending results... 15406 1726854943.07003: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 15406 1726854943.07010: in run() - task 0affcc66-ac2b-3c83-32d3-000000000015 15406 1726854943.07016: variable 'ansible_search_path' from source: unknown 15406 1726854943.07030: variable 'ansible_search_path' from source: unknown 15406 1726854943.07077: calling self._execute() 15406 1726854943.07195: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854943.07218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854943.07221: variable 'omit' from source: magic vars 15406 1726854943.07603: variable 'ansible_distribution_major_version' from source: facts 15406 1726854943.07616: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854943.07623: variable 'omit' from source: magic vars 15406 1726854943.07649: variable 'omit' from source: magic vars 15406 1726854943.07733: variable 'network_provider' from source: set_fact 15406 1726854943.07744: variable 'omit' from source: magic vars 15406 1726854943.07776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854943.07814: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854943.07992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854943.07996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854943.07999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 15406 1726854943.08001: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854943.08005: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854943.08008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854943.08040: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854943.08051: Set connection var ansible_timeout to 10 15406 1726854943.08054: Set connection var ansible_connection to ssh 15406 1726854943.08132: Set connection var ansible_shell_type to sh 15406 1726854943.08146: Set connection var ansible_shell_executable to /bin/sh 15406 1726854943.08150: Set connection var ansible_pipelining to False 15406 1726854943.08152: variable 'ansible_shell_executable' from source: unknown 15406 1726854943.08155: variable 'ansible_connection' from source: unknown 15406 1726854943.08157: variable 'ansible_module_compression' from source: unknown 15406 1726854943.08160: variable 'ansible_shell_type' from source: unknown 15406 1726854943.08162: variable 'ansible_shell_executable' from source: unknown 15406 1726854943.08164: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854943.08166: variable 'ansible_pipelining' from source: unknown 15406 1726854943.08168: variable 'ansible_timeout' from source: unknown 15406 1726854943.08170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854943.08338: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854943.08342: variable 'omit' from source: magic vars 15406 1726854943.08344: starting attempt loop 15406 1726854943.08346: running the handler 15406 1726854943.08380: handler run 
complete 15406 1726854943.08389: attempt loop complete, returning result 15406 1726854943.08392: _execute() done 15406 1726854943.08396: dumping result to json 15406 1726854943.08399: done dumping result, returning 15406 1726854943.08489: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-3c83-32d3-000000000015] 15406 1726854943.08493: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000015 15406 1726854943.08561: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000015 15406 1726854943.08564: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 15406 1726854943.08641: no more pending results, returning what we have 15406 1726854943.08644: results queue empty 15406 1726854943.08645: checking for any_errors_fatal 15406 1726854943.08651: done checking for any_errors_fatal 15406 1726854943.08652: checking for max_fail_percentage 15406 1726854943.08653: done checking for max_fail_percentage 15406 1726854943.08654: checking to see if all hosts have failed and the running result is not ok 15406 1726854943.08655: done checking to see if all hosts have failed 15406 1726854943.08656: getting the remaining hosts for this loop 15406 1726854943.08657: done getting the remaining hosts for this loop 15406 1726854943.08660: getting the next task for host managed_node2 15406 1726854943.08664: done getting next task for host managed_node2 15406 1726854943.08668: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15406 1726854943.08669: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854943.08678: getting variables 15406 1726854943.08679: in VariableManager get_vars() 15406 1726854943.08714: Calling all_inventory to load vars for managed_node2 15406 1726854943.08717: Calling groups_inventory to load vars for managed_node2 15406 1726854943.08719: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854943.08727: Calling all_plugins_play to load vars for managed_node2 15406 1726854943.08729: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854943.08732: Calling groups_plugins_play to load vars for managed_node2 15406 1726854943.09701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854943.10573: done with get_vars() 15406 1726854943.10591: done getting variables 15406 1726854943.10656: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:55:43 -0400 (0:00:00.044) 0:00:10.929 ****** 15406 1726854943.10677: entering _queue_task() for managed_node2/fail 15406 1726854943.10678: Creating lock for fail 15406 1726854943.10910: worker is 1 (out of 1 available) 15406 1726854943.10924: exiting _queue_task() for managed_node2/fail 15406 1726854943.10937: done queuing things up, now waiting for results queue to drain 15406 1726854943.10938: waiting for pending results... 
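The "Print network provider" result above ("Using network provider: nm") comes from a plain `debug` action at `roles/network/tasks/main.yml:7`. A hedged reconstruction of what that task likely looks like — the role's actual source may differ in detail:

```yaml
# Hypothetical reconstruction of the "Print network provider" task the
# log executes; the real role source may differ.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```

The earlier "Check which packages are installed" result, by contrast, printed as `"censored"` because that task runs with `no_log: true` (visible as `'_ansible_no_log': True` in the module invocation), which hides module output in results and logs.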
15406 1726854943.11097: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
15406 1726854943.11172: in run() - task 0affcc66-ac2b-3c83-32d3-000000000016
15406 1726854943.11183: variable 'ansible_search_path' from source: unknown
15406 1726854943.11186: variable 'ansible_search_path' from source: unknown
15406 1726854943.11218: calling self._execute()
15406 1726854943.11278: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854943.11283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854943.11297: variable 'omit' from source: magic vars
15406 1726854943.11560: variable 'ansible_distribution_major_version' from source: facts
15406 1726854943.11569: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854943.11654: variable 'network_state' from source: role '' defaults
15406 1726854943.11662: Evaluated conditional (network_state != {}): False
15406 1726854943.11666: when evaluation is False, skipping this task
15406 1726854943.11668: _execute() done
15406 1726854943.11671: dumping result to json
15406 1726854943.11673: done dumping result, returning
15406 1726854943.11680: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-3c83-32d3-000000000016]
15406 1726854943.11683: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000016
15406 1726854943.11770: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000016
15406 1726854943.11772: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15406 1726854943.11853: no more pending results, returning what we have
15406 1726854943.11856: results queue empty
15406 1726854943.11857: checking for any_errors_fatal
15406 1726854943.11861: done checking for any_errors_fatal
15406 1726854943.11862: checking for max_fail_percentage
15406 1726854943.11863: done checking for max_fail_percentage
15406 1726854943.11864: checking to see if all hosts have failed and the running result is not ok
15406 1726854943.11864: done checking to see if all hosts have failed
15406 1726854943.11865: getting the remaining hosts for this loop
15406 1726854943.11866: done getting the remaining hosts for this loop
15406 1726854943.11869: getting the next task for host managed_node2
15406 1726854943.11875: done getting next task for host managed_node2
15406 1726854943.11878: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
15406 1726854943.11880: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854943.11896: getting variables
15406 1726854943.11897: in VariableManager get_vars()
15406 1726854943.11925: Calling all_inventory to load vars for managed_node2
15406 1726854943.11927: Calling groups_inventory to load vars for managed_node2
15406 1726854943.11929: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854943.11937: Calling all_plugins_play to load vars for managed_node2
15406 1726854943.11940: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854943.11942: Calling groups_plugins_play to load vars for managed_node2
15406 1726854943.13157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854943.14428: done with get_vars()
15406 1726854943.14445: done getting variables
15406 1726854943.14489: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 13:55:43 -0400 (0:00:00.038) 0:00:10.967 ******
15406 1726854943.14517: entering _queue_task() for managed_node2/fail
15406 1726854943.14820: worker is 1 (out of 1 available)
15406 1726854943.14835: exiting _queue_task() for managed_node2/fail
15406 1726854943.14847: done queuing things up, now waiting for results queue to drain
15406 1726854943.14849: waiting for pending results...
15406 1726854943.15059: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
15406 1726854943.15134: in run() - task 0affcc66-ac2b-3c83-32d3-000000000017
15406 1726854943.15144: variable 'ansible_search_path' from source: unknown
15406 1726854943.15147: variable 'ansible_search_path' from source: unknown
15406 1726854943.15176: calling self._execute()
15406 1726854943.15245: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854943.15249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854943.15258: variable 'omit' from source: magic vars
15406 1726854943.15534: variable 'ansible_distribution_major_version' from source: facts
15406 1726854943.15541: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854943.15628: variable 'network_state' from source: role '' defaults
15406 1726854943.15641: Evaluated conditional (network_state != {}): False
15406 1726854943.15644: when evaluation is False, skipping this task
15406 1726854943.15647: _execute() done
15406 1726854943.15649: dumping result to json
15406 1726854943.15651: done dumping result, returning
15406 1726854943.15655: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-3c83-32d3-000000000017]
15406 1726854943.15683: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000017
15406 1726854943.15756: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000017
15406 1726854943.15759: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15406 1726854943.15808: no more pending results, returning what we have
15406 1726854943.15812: results queue empty
15406 1726854943.15812: checking for any_errors_fatal
15406 1726854943.15819: done checking for any_errors_fatal
15406 1726854943.15820: checking for max_fail_percentage
15406 1726854943.15822: done checking for max_fail_percentage
15406 1726854943.15823: checking to see if all hosts have failed and the running result is not ok
15406 1726854943.15823: done checking to see if all hosts have failed
15406 1726854943.15824: getting the remaining hosts for this loop
15406 1726854943.15825: done getting the remaining hosts for this loop
15406 1726854943.15829: getting the next task for host managed_node2
15406 1726854943.15834: done getting next task for host managed_node2
15406 1726854943.15837: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
15406 1726854943.15840: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854943.15853: getting variables
15406 1726854943.15855: in VariableManager get_vars()
15406 1726854943.15891: Calling all_inventory to load vars for managed_node2
15406 1726854943.15893: Calling groups_inventory to load vars for managed_node2
15406 1726854943.15896: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854943.15903: Calling all_plugins_play to load vars for managed_node2
15406 1726854943.15906: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854943.15908: Calling groups_plugins_play to load vars for managed_node2
15406 1726854943.16829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854943.17718: done with get_vars()
15406 1726854943.17738: done getting variables
15406 1726854943.17781: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 13:55:43 -0400 (0:00:00.032) 0:00:11.000 ******
15406 1726854943.17810: entering _queue_task() for managed_node2/fail
15406 1726854943.18073: worker is 1 (out of 1 available)
15406 1726854943.18091: exiting _queue_task() for managed_node2/fail
15406 1726854943.18103: done queuing things up, now waiting for results queue to drain
15406 1726854943.18104: waiting for pending results...
15406 1726854943.18338: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
15406 1726854943.18402: in run() - task 0affcc66-ac2b-3c83-32d3-000000000018
15406 1726854943.18421: variable 'ansible_search_path' from source: unknown
15406 1726854943.18425: variable 'ansible_search_path' from source: unknown
15406 1726854943.18467: calling self._execute()
15406 1726854943.18515: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854943.18520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854943.18531: variable 'omit' from source: magic vars
15406 1726854943.18841: variable 'ansible_distribution_major_version' from source: facts
15406 1726854943.18845: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854943.18983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15406 1726854943.21012: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15406 1726854943.21135: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15406 1726854943.21145: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15406 1726854943.21306: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15406 1726854943.21310: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15406 1726854943.21312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15406 1726854943.21315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15406 1726854943.21325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15406 1726854943.21352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15406 1726854943.21362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15406 1726854943.21432: variable 'ansible_distribution_major_version' from source: facts
15406 1726854943.21443: Evaluated conditional (ansible_distribution_major_version | int > 9): True
15406 1726854943.21521: variable 'ansible_distribution' from source: facts
15406 1726854943.21525: variable '__network_rh_distros' from source: role '' defaults
15406 1726854943.21532: Evaluated conditional (ansible_distribution in __network_rh_distros): True
15406 1726854943.21684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15406 1726854943.21704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15406 1726854943.21727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15406 1726854943.21749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15406 1726854943.21760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15406 1726854943.21796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15406 1726854943.21813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15406 1726854943.21835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15406 1726854943.21859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15406 1726854943.21870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15406 1726854943.21903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15406 1726854943.21919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15406 1726854943.21944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15406 1726854943.21963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15406 1726854943.21974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15406 1726854943.22160: variable 'network_connections' from source: play vars
15406 1726854943.22173: variable 'interface' from source: set_fact
15406 1726854943.22223: variable 'interface' from source: set_fact
15406 1726854943.22230: variable 'interface' from source: set_fact
15406 1726854943.22277: variable 'interface' from source: set_fact
15406 1726854943.22289: variable 'network_state' from source: role '' defaults
15406 1726854943.22338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15406 1726854943.22447: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15406 1726854943.22472: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15406 1726854943.22511: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15406 1726854943.22532: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15406 1726854943.22563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15406 1726854943.22582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15406 1726854943.22608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15406 1726854943.22625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15406 1726854943.22651: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
15406 1726854943.22654: when evaluation is False, skipping this task
15406 1726854943.22657: _execute() done
15406 1726854943.22659: dumping result to json
15406 1726854943.22661: done dumping result, returning
15406 1726854943.22668: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-3c83-32d3-000000000018]
15406 1726854943.22671: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000018
15406 1726854943.22754: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000018
15406 1726854943.22757: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
15406 1726854943.22803: no more pending results, returning what we have
15406 1726854943.22807: results queue empty
15406 1726854943.22808: checking for any_errors_fatal
15406 1726854943.22813: done checking for any_errors_fatal
15406 1726854943.22814: checking for max_fail_percentage
15406 1726854943.22815: done checking for max_fail_percentage
15406 1726854943.22816: checking to see if all hosts have failed and the running result is not ok
15406 1726854943.22817: done checking to see if all hosts have failed
15406 1726854943.22817: getting the remaining hosts for this loop
15406 1726854943.22819: done getting the remaining hosts for this loop
15406 1726854943.22822: getting the next task for host managed_node2
15406 1726854943.22828: done getting next task for host managed_node2
15406 1726854943.22831: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
15406 1726854943.22833: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854943.22844: getting variables
15406 1726854943.22846: in VariableManager get_vars()
15406 1726854943.22881: Calling all_inventory to load vars for managed_node2
15406 1726854943.22884: Calling groups_inventory to load vars for managed_node2
15406 1726854943.22886: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854943.22896: Calling all_plugins_play to load vars for managed_node2
15406 1726854943.22899: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854943.22901: Calling groups_plugins_play to load vars for managed_node2
15406 1726854943.23784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854943.24654: done with get_vars()
15406 1726854943.24668: done getting variables
15406 1726854943.24744: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 13:55:43 -0400 (0:00:00.069) 0:00:11.070 ******
15406 1726854943.24764: entering _queue_task() for managed_node2/dnf
15406 1726854943.25005: worker is 1 (out of 1 available)
15406 1726854943.25019: exiting _queue_task() for managed_node2/dnf
15406 1726854943.25030: done queuing things up, now waiting for results queue to drain
15406 1726854943.25032: waiting for pending results...
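[editor's note] The `false_condition` recorded for the teaming task above is a Jinja2 `selectattr` chain. As a hedged illustration only (this is not Ansible code; the function and sample data are hypothetical), plain Python can mimic how that chain counts `team`-typed entries and why an ethernet-only connection list evaluates to False:

```python
import re


def has_team_interfaces(network_connections, network_state):
    """Approximate the role's when-clause: any connection, or any entry in
    network_state['interfaces'], whose 'type' is defined and matches ^team$."""
    def team_count(items):
        # selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length
        return len([i for i in items
                    if "type" in i and re.match("^team$", i["type"])])

    return (team_count(network_connections) > 0
            or team_count(network_state.get("interfaces", [])) > 0)


# A connection list without any team interfaces, as in this run:
conns = [{"name": "ethtest0", "type": "ethernet", "state": "up"}]
print(has_team_interfaces(conns, {}))  # False
```

The `"type" in i` guard plays the role of Jinja2's `defined` test, so entries without a `type` key are skipped rather than raising an error.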
15406 1726854943.25206: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
15406 1726854943.25277: in run() - task 0affcc66-ac2b-3c83-32d3-000000000019
15406 1726854943.25283: variable 'ansible_search_path' from source: unknown
15406 1726854943.25291: variable 'ansible_search_path' from source: unknown
15406 1726854943.25317: calling self._execute()
15406 1726854943.25378: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854943.25393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854943.25402: variable 'omit' from source: magic vars
15406 1726854943.25669: variable 'ansible_distribution_major_version' from source: facts
15406 1726854943.25678: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854943.25817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15406 1726854943.27342: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15406 1726854943.27385: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15406 1726854943.27416: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15406 1726854943.27444: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15406 1726854943.27464: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15406 1726854943.27525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15406 1726854943.27546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15406 1726854943.27569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15406 1726854943.27597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15406 1726854943.27608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15406 1726854943.27693: variable 'ansible_distribution' from source: facts
15406 1726854943.27697: variable 'ansible_distribution_major_version' from source: facts
15406 1726854943.27708: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
15406 1726854943.27783: variable '__network_wireless_connections_defined' from source: role '' defaults
15406 1726854943.27867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15406 1726854943.27891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15406 1726854943.27907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15406 1726854943.27932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15406 1726854943.27942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15406 1726854943.27969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15406 1726854943.27986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15406 1726854943.28011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15406 1726854943.28050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15406 1726854943.28062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15406 1726854943.28118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15406 1726854943.28122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15406 1726854943.28128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15406 1726854943.28153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15406 1726854943.28163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15406 1726854943.28266: variable 'network_connections' from source: play vars
15406 1726854943.28276: variable 'interface' from source: set_fact
15406 1726854943.28327: variable 'interface' from source: set_fact
15406 1726854943.28338: variable 'interface' from source: set_fact
15406 1726854943.28376: variable 'interface' from source: set_fact
15406 1726854943.28425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15406 1726854943.28547: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15406 1726854943.28576: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15406 1726854943.28608: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15406 1726854943.28629: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15406 1726854943.28666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15406 1726854943.28680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15406 1726854943.28704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15406 1726854943.28721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15406 1726854943.28762: variable '__network_team_connections_defined' from source: role '' defaults
15406 1726854943.28915: variable 'network_connections' from source: play vars
15406 1726854943.28919: variable 'interface' from source: set_fact
15406 1726854943.28961: variable 'interface' from source: set_fact
15406 1726854943.28967: variable 'interface' from source: set_fact
15406 1726854943.29014: variable 'interface' from source: set_fact
15406 1726854943.29037: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
15406 1726854943.29040: when evaluation is False, skipping this task
15406 1726854943.29043: _execute() done
15406 1726854943.29045: dumping result to json
15406 1726854943.29047: done dumping result, returning
15406 1726854943.29054: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-3c83-32d3-000000000019]
15406 1726854943.29058: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000019
15406 1726854943.29145: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000019
15406 1726854943.29147: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
15406 1726854943.29198: no more pending results, returning what we have
15406 1726854943.29202: results queue empty
15406 1726854943.29203: checking for any_errors_fatal
15406 1726854943.29207: done checking for any_errors_fatal
15406 1726854943.29208: checking for max_fail_percentage
15406 1726854943.29210: done checking for max_fail_percentage
15406 1726854943.29211: checking to see if all hosts have failed and the running result is not ok
15406 1726854943.29212: done checking to see if all hosts have failed
15406 1726854943.29212: getting the remaining hosts for this loop
15406 1726854943.29213: done getting the remaining hosts for this loop
15406 1726854943.29217: getting the next task for host managed_node2
15406 1726854943.29223: done getting next task for host managed_node2
15406 1726854943.29226: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
15406 1726854943.29228: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854943.29240: getting variables
15406 1726854943.29242: in VariableManager get_vars()
15406 1726854943.29277: Calling all_inventory to load vars for managed_node2
15406 1726854943.29280: Calling groups_inventory to load vars for managed_node2
15406 1726854943.29282: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854943.29295: Calling all_plugins_play to load vars for managed_node2
15406 1726854943.29297: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854943.29300: Calling groups_plugins_play to load vars for managed_node2
15406 1726854943.30077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854943.31534: done with get_vars()
15406 1726854943.31551: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15406 1726854943.31608: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 13:55:43 -0400 (0:00:00.068) 0:00:11.139 ******
15406 1726854943.31629: entering _queue_task() for managed_node2/yum
15406 1726854943.31630: Creating lock for yum
15406 1726854943.31885: worker is 1 (out of 1 available)
15406 1726854943.31901: exiting _queue_task() for managed_node2/yum
15406 1726854943.31913: done queuing things up, now waiting for results queue to drain
15406 1726854943.31915: waiting for pending results...
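[editor's note] The YUM task queued above ends up gated on `ansible_distribution_major_version | int < 8`, as its skip result shows. A minimal, hypothetical Python sketch (the helper name is invented, not Ansible code) of that when-clause, including the `| int` coercion of the string-valued fact:

```python
def yum_check_needed(distribution_major_version):
    """Mirror the task's when-clause: only distributions with a major
    version below 8 still use YUM, so the check is skipped otherwise."""
    # Jinja2's `| int` filter coerces the string fact to an integer
    return int(distribution_major_version) < 8


print(yum_check_needed("7"))   # True: the YUM check would run
print(yum_check_needed("40"))  # False: skipped, as in this log
```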
15406 1726854943.32094: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15406 1726854943.32158: in run() - task 0affcc66-ac2b-3c83-32d3-00000000001a 15406 1726854943.32169: variable 'ansible_search_path' from source: unknown 15406 1726854943.32173: variable 'ansible_search_path' from source: unknown 15406 1726854943.32203: calling self._execute() 15406 1726854943.32268: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854943.32272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854943.32280: variable 'omit' from source: magic vars 15406 1726854943.32552: variable 'ansible_distribution_major_version' from source: facts 15406 1726854943.32562: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854943.32683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854943.34362: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854943.34498: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854943.34502: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854943.34523: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854943.34555: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854943.34636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854943.34671: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.34704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.34749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.34894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.34897: variable 'ansible_distribution_major_version' from source: facts 15406 1726854943.34900: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15406 1726854943.34904: when evaluation is False, skipping this task 15406 1726854943.34907: _execute() done 15406 1726854943.34909: dumping result to json 15406 1726854943.34912: done dumping result, returning 15406 1726854943.34925: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-3c83-32d3-00000000001a] 15406 1726854943.34936: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000001a skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15406 1726854943.35114: no more pending results, returning what we have 15406 1726854943.35118: results queue empty 15406 1726854943.35118: checking for any_errors_fatal 15406 1726854943.35123: done 
checking for any_errors_fatal 15406 1726854943.35123: checking for max_fail_percentage 15406 1726854943.35125: done checking for max_fail_percentage 15406 1726854943.35126: checking to see if all hosts have failed and the running result is not ok 15406 1726854943.35127: done checking to see if all hosts have failed 15406 1726854943.35127: getting the remaining hosts for this loop 15406 1726854943.35129: done getting the remaining hosts for this loop 15406 1726854943.35132: getting the next task for host managed_node2 15406 1726854943.35138: done getting next task for host managed_node2 15406 1726854943.35142: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15406 1726854943.35144: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854943.35156: getting variables 15406 1726854943.35157: in VariableManager get_vars() 15406 1726854943.35196: Calling all_inventory to load vars for managed_node2 15406 1726854943.35199: Calling groups_inventory to load vars for managed_node2 15406 1726854943.35201: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854943.35209: Calling all_plugins_play to load vars for managed_node2 15406 1726854943.35211: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854943.35214: Calling groups_plugins_play to load vars for managed_node2 15406 1726854943.35802: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000001a 15406 1726854943.35806: WORKER PROCESS EXITING 15406 1726854943.36007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854943.36875: done with get_vars() 15406 1726854943.36897: done getting variables 15406 1726854943.36943: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:55:43 -0400 (0:00:00.053) 0:00:11.192 ****** 15406 1726854943.36967: entering _queue_task() for managed_node2/fail 15406 1726854943.37207: worker is 1 (out of 1 available) 15406 1726854943.37222: exiting _queue_task() for managed_node2/fail 15406 1726854943.37236: done queuing things up, now waiting for results queue to drain 15406 1726854943.37238: waiting for pending results... 
15406 1726854943.37408: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15406 1726854943.37475: in run() - task 0affcc66-ac2b-3c83-32d3-00000000001b 15406 1726854943.37490: variable 'ansible_search_path' from source: unknown 15406 1726854943.37494: variable 'ansible_search_path' from source: unknown 15406 1726854943.37590: calling self._execute() 15406 1726854943.37635: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854943.37647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854943.37660: variable 'omit' from source: magic vars 15406 1726854943.38047: variable 'ansible_distribution_major_version' from source: facts 15406 1726854943.38065: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854943.38237: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854943.38412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854943.40831: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854943.40911: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854943.40953: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854943.41192: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854943.41196: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854943.41199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15406 1726854943.41201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.41203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.41222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.41243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.41294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854943.41334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.41363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.41410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.41442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.41490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854943.41520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.41560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.41604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.41623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.41824: variable 'network_connections' from source: play vars 15406 1726854943.41841: variable 'interface' from source: set_fact 15406 1726854943.41932: variable 'interface' from source: set_fact 15406 1726854943.41989: variable 'interface' from source: set_fact 15406 1726854943.42024: variable 'interface' from source: set_fact 15406 1726854943.42112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854943.42658: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854943.42702: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854943.42855: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854943.42859: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854943.42861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854943.42866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854943.42899: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.42930: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854943.43010: variable '__network_team_connections_defined' from source: role '' defaults 15406 1726854943.43294: variable 'network_connections' from source: play vars 15406 1726854943.43305: variable 'interface' from source: set_fact 15406 1726854943.43363: variable 'interface' from source: set_fact 15406 1726854943.43374: variable 'interface' from source: set_fact 15406 1726854943.43443: variable 'interface' from source: set_fact 15406 1726854943.43478: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15406 1726854943.43486: when evaluation is False, skipping this task 15406 1726854943.43507: _execute() done 15406 1726854943.43517: dumping result to json 15406 1726854943.43525: done dumping result, returning 15406 1726854943.43618: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-3c83-32d3-00000000001b] 15406 1726854943.43630: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000001b 15406 1726854943.43701: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000001b 15406 1726854943.43703: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15406 1726854943.43757: no more pending results, returning what we have 15406 1726854943.43760: results queue empty 15406 1726854943.43761: checking for any_errors_fatal 15406 1726854943.43765: done checking for any_errors_fatal 15406 1726854943.43766: checking for max_fail_percentage 15406 1726854943.43768: done checking for max_fail_percentage 15406 1726854943.43768: checking to see if all hosts have failed and the running result is not ok 15406 1726854943.43769: done checking to see if all hosts have failed 15406 1726854943.43770: getting the remaining hosts for this loop 15406 1726854943.43771: done getting the remaining hosts for this loop 15406 1726854943.43774: getting the next task for host managed_node2 15406 1726854943.43781: done getting next task for host managed_node2 15406 1726854943.43784: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15406 1726854943.43786: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854943.43800: getting variables 15406 1726854943.43802: in VariableManager get_vars() 15406 1726854943.43838: Calling all_inventory to load vars for managed_node2 15406 1726854943.43841: Calling groups_inventory to load vars for managed_node2 15406 1726854943.43843: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854943.43852: Calling all_plugins_play to load vars for managed_node2 15406 1726854943.43855: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854943.43858: Calling groups_plugins_play to load vars for managed_node2 15406 1726854943.45745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854943.47430: done with get_vars() 15406 1726854943.47466: done getting variables 15406 1726854943.47531: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:55:43 -0400 (0:00:00.105) 0:00:11.298 ****** 15406 1726854943.47563: entering _queue_task() for managed_node2/package 15406 1726854943.48023: worker is 1 (out of 1 available) 15406 1726854943.48035: exiting _queue_task() for managed_node2/package 15406 1726854943.48045: done queuing things up, now waiting for results queue to drain 15406 1726854943.48047: waiting for pending results... 
15406 1726854943.48467: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 15406 1726854943.48472: in run() - task 0affcc66-ac2b-3c83-32d3-00000000001c 15406 1726854943.48475: variable 'ansible_search_path' from source: unknown 15406 1726854943.48478: variable 'ansible_search_path' from source: unknown 15406 1726854943.48481: calling self._execute() 15406 1726854943.48534: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854943.48546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854943.48575: variable 'omit' from source: magic vars 15406 1726854943.48957: variable 'ansible_distribution_major_version' from source: facts 15406 1726854943.48976: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854943.49191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854943.49489: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854943.49550: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854943.49586: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854943.49655: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854943.49783: variable 'network_packages' from source: role '' defaults 15406 1726854943.49911: variable '__network_provider_setup' from source: role '' defaults 15406 1726854943.49994: variable '__network_service_name_default_nm' from source: role '' defaults 15406 1726854943.50013: variable '__network_service_name_default_nm' from source: role '' defaults 15406 1726854943.50027: variable '__network_packages_default_nm' from source: role '' defaults 15406 1726854943.50090: variable 
'__network_packages_default_nm' from source: role '' defaults 15406 1726854943.50291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854943.52300: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854943.52350: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854943.52390: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854943.52415: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854943.52434: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854943.52493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854943.52517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.52535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.52560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.52570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 
1726854943.52604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854943.52624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.52641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.52665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.52675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.52822: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15406 1726854943.52898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854943.52914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.52930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.52959: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.52969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.53031: variable 'ansible_python' from source: facts 15406 1726854943.53054: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15406 1726854943.53112: variable '__network_wpa_supplicant_required' from source: role '' defaults 15406 1726854943.53167: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15406 1726854943.53247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854943.53269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.53285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.53312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.53322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.53353: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854943.53376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.53398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.53422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.53433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.53531: variable 'network_connections' from source: play vars 15406 1726854943.53535: variable 'interface' from source: set_fact 15406 1726854943.53608: variable 'interface' from source: set_fact 15406 1726854943.53614: variable 'interface' from source: set_fact 15406 1726854943.53680: variable 'interface' from source: set_fact 15406 1726854943.53736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854943.53755: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854943.53774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.53798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854943.53834: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854943.54012: variable 'network_connections' from source: play vars 15406 1726854943.54015: variable 'interface' from source: set_fact 15406 1726854943.54089: variable 'interface' from source: set_fact 15406 1726854943.54094: variable 'interface' from source: set_fact 15406 1726854943.54164: variable 'interface' from source: set_fact 15406 1726854943.54203: variable '__network_packages_default_wireless' from source: role '' defaults 15406 1726854943.54272: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854943.54510: variable 'network_connections' from source: play vars 15406 1726854943.54513: variable 'interface' from source: set_fact 15406 1726854943.54570: variable 'interface' from source: set_fact 15406 1726854943.54574: variable 'interface' from source: set_fact 15406 1726854943.54640: variable 'interface' from source: set_fact 15406 1726854943.54792: variable '__network_packages_default_team' from source: role '' defaults 15406 1726854943.54795: variable '__network_team_connections_defined' from source: role '' defaults 15406 1726854943.55023: variable 'network_connections' from source: play vars 15406 1726854943.55028: variable 'interface' from source: set_fact 15406 1726854943.55089: variable 'interface' from source: set_fact 15406 1726854943.55093: variable 'interface' from source: set_fact 15406 1726854943.55203: variable 'interface' from source: set_fact 15406 1726854943.55216: variable '__network_service_name_default_initscripts' from source: role '' defaults 15406 
1726854943.55264: variable '__network_service_name_default_initscripts' from source: role '' defaults 15406 1726854943.55270: variable '__network_packages_default_initscripts' from source: role '' defaults 15406 1726854943.55327: variable '__network_packages_default_initscripts' from source: role '' defaults 15406 1726854943.55569: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15406 1726854943.55978: variable 'network_connections' from source: play vars 15406 1726854943.55981: variable 'interface' from source: set_fact 15406 1726854943.56040: variable 'interface' from source: set_fact 15406 1726854943.56046: variable 'interface' from source: set_fact 15406 1726854943.56131: variable 'interface' from source: set_fact 15406 1726854943.56137: variable 'ansible_distribution' from source: facts 15406 1726854943.56140: variable '__network_rh_distros' from source: role '' defaults 15406 1726854943.56142: variable 'ansible_distribution_major_version' from source: facts 15406 1726854943.56293: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15406 1726854943.56310: variable 'ansible_distribution' from source: facts 15406 1726854943.56313: variable '__network_rh_distros' from source: role '' defaults 15406 1726854943.56315: variable 'ansible_distribution_major_version' from source: facts 15406 1726854943.56328: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15406 1726854943.56477: variable 'ansible_distribution' from source: facts 15406 1726854943.56481: variable '__network_rh_distros' from source: role '' defaults 15406 1726854943.56484: variable 'ansible_distribution_major_version' from source: facts 15406 1726854943.56550: variable 'network_provider' from source: set_fact 15406 1726854943.56553: variable 'ansible_facts' from source: unknown 15406 1726854943.56966: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 15406 1726854943.56970: when evaluation is False, skipping this task 15406 1726854943.56972: _execute() done 15406 1726854943.56975: dumping result to json 15406 1726854943.56977: done dumping result, returning 15406 1726854943.56984: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-3c83-32d3-00000000001c] 15406 1726854943.56991: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000001c 15406 1726854943.57077: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000001c 15406 1726854943.57080: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15406 1726854943.57133: no more pending results, returning what we have 15406 1726854943.57137: results queue empty 15406 1726854943.57138: checking for any_errors_fatal 15406 1726854943.57144: done checking for any_errors_fatal 15406 1726854943.57144: checking for max_fail_percentage 15406 1726854943.57146: done checking for max_fail_percentage 15406 1726854943.57147: checking to see if all hosts have failed and the running result is not ok 15406 1726854943.57148: done checking to see if all hosts have failed 15406 1726854943.57148: getting the remaining hosts for this loop 15406 1726854943.57150: done getting the remaining hosts for this loop 15406 1726854943.57153: getting the next task for host managed_node2 15406 1726854943.57159: done getting next task for host managed_node2 15406 1726854943.57163: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15406 1726854943.57165: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854943.57176: getting variables 15406 1726854943.57178: in VariableManager get_vars() 15406 1726854943.57217: Calling all_inventory to load vars for managed_node2 15406 1726854943.57219: Calling groups_inventory to load vars for managed_node2 15406 1726854943.57222: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854943.57235: Calling all_plugins_play to load vars for managed_node2 15406 1726854943.57237: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854943.57240: Calling groups_plugins_play to load vars for managed_node2 15406 1726854943.58033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854943.58895: done with get_vars() 15406 1726854943.58913: done getting variables 15406 1726854943.58958: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:55:43 -0400 (0:00:00.114) 0:00:11.412 ****** 15406 1726854943.58980: entering _queue_task() for managed_node2/package 15406 1726854943.59266: worker is 1 (out of 1 available) 15406 1726854943.59280: exiting _queue_task() for managed_node2/package 15406 1726854943.59296: done queuing things up, now waiting for results queue to drain 15406 1726854943.59298: waiting for pending results... 
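The skip recorded above for the "Install packages" task comes from its `when:` guard. The following is a hypothetical reconstruction of the task's shape, inferred only from the task name and the `false_condition` printed in the log; the module arguments are assumptions, not the role's actual source:

```yaml
# Sketch only -- the name and conditional are copied from the log output;
# the package module arguments are assumed for illustration.
- name: Install packages
  package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
```

Because every entry in `network_packages` was already present among `ansible_facts.packages.keys()`, the `subset` test made the negated condition False, so the task was skipped on managed_node2.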
15406 1726854943.59612: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15406 1726854943.59677: in run() - task 0affcc66-ac2b-3c83-32d3-00000000001d 15406 1726854943.59707: variable 'ansible_search_path' from source: unknown 15406 1726854943.59716: variable 'ansible_search_path' from source: unknown 15406 1726854943.59814: calling self._execute() 15406 1726854943.59863: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854943.59876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854943.59897: variable 'omit' from source: magic vars 15406 1726854943.60246: variable 'ansible_distribution_major_version' from source: facts 15406 1726854943.60257: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854943.60342: variable 'network_state' from source: role '' defaults 15406 1726854943.60351: Evaluated conditional (network_state != {}): False 15406 1726854943.60354: when evaluation is False, skipping this task 15406 1726854943.60358: _execute() done 15406 1726854943.60361: dumping result to json 15406 1726854943.60363: done dumping result, returning 15406 1726854943.60370: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-3c83-32d3-00000000001d] 15406 1726854943.60375: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000001d 15406 1726854943.60461: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000001d 15406 1726854943.60463: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15406 1726854943.60534: no more pending results, returning what we have 15406 1726854943.60538: results queue empty 15406 1726854943.60539: checking 
for any_errors_fatal 15406 1726854943.60546: done checking for any_errors_fatal 15406 1726854943.60547: checking for max_fail_percentage 15406 1726854943.60548: done checking for max_fail_percentage 15406 1726854943.60549: checking to see if all hosts have failed and the running result is not ok 15406 1726854943.60550: done checking to see if all hosts have failed 15406 1726854943.60550: getting the remaining hosts for this loop 15406 1726854943.60552: done getting the remaining hosts for this loop 15406 1726854943.60555: getting the next task for host managed_node2 15406 1726854943.60560: done getting next task for host managed_node2 15406 1726854943.60563: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15406 1726854943.60565: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854943.60578: getting variables 15406 1726854943.60579: in VariableManager get_vars() 15406 1726854943.60613: Calling all_inventory to load vars for managed_node2 15406 1726854943.60616: Calling groups_inventory to load vars for managed_node2 15406 1726854943.60618: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854943.60626: Calling all_plugins_play to load vars for managed_node2 15406 1726854943.60628: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854943.60631: Calling groups_plugins_play to load vars for managed_node2 15406 1726854943.65626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854943.66536: done with get_vars() 15406 1726854943.66558: done getting variables 15406 1726854943.66598: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:55:43 -0400 (0:00:00.076) 0:00:11.488 ****** 15406 1726854943.66617: entering _queue_task() for managed_node2/package 15406 1726854943.66871: worker is 1 (out of 1 available) 15406 1726854943.66884: exiting _queue_task() for managed_node2/package 15406 1726854943.66900: done queuing things up, now waiting for results queue to drain 15406 1726854943.66902: waiting for pending results... 
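The "Install NetworkManager and nmstate when using network_state variable" task above was skipped because `network_state` is empty. A hypothetical sketch of the task, inferred from the two evaluated conditionals in the log (the package list is an assumption based only on the task name):

```yaml
# Sketch -- conditionals are taken verbatim from the log; the package
# names are inferred from the task title, not from the role source.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}
```

Here `ansible_distribution_major_version != '6'` evaluated True but `network_state != {}` evaluated False, so the combined `when:` failed and the task (and the similar python3-libnmstate task that follows) was skipped.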
15406 1726854943.67066: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15406 1726854943.67143: in run() - task 0affcc66-ac2b-3c83-32d3-00000000001e 15406 1726854943.67155: variable 'ansible_search_path' from source: unknown 15406 1726854943.67159: variable 'ansible_search_path' from source: unknown 15406 1726854943.67184: calling self._execute() 15406 1726854943.67255: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854943.67260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854943.67269: variable 'omit' from source: magic vars 15406 1726854943.67550: variable 'ansible_distribution_major_version' from source: facts 15406 1726854943.67560: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854943.67645: variable 'network_state' from source: role '' defaults 15406 1726854943.67655: Evaluated conditional (network_state != {}): False 15406 1726854943.67658: when evaluation is False, skipping this task 15406 1726854943.67661: _execute() done 15406 1726854943.67663: dumping result to json 15406 1726854943.67666: done dumping result, returning 15406 1726854943.67674: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-3c83-32d3-00000000001e] 15406 1726854943.67679: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000001e 15406 1726854943.67767: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000001e 15406 1726854943.67770: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15406 1726854943.67840: no more pending results, returning what we have 15406 1726854943.67843: results queue empty 15406 1726854943.67844: checking for 
any_errors_fatal 15406 1726854943.67852: done checking for any_errors_fatal 15406 1726854943.67853: checking for max_fail_percentage 15406 1726854943.67855: done checking for max_fail_percentage 15406 1726854943.67855: checking to see if all hosts have failed and the running result is not ok 15406 1726854943.67856: done checking to see if all hosts have failed 15406 1726854943.67857: getting the remaining hosts for this loop 15406 1726854943.67858: done getting the remaining hosts for this loop 15406 1726854943.67861: getting the next task for host managed_node2 15406 1726854943.67868: done getting next task for host managed_node2 15406 1726854943.67871: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15406 1726854943.67873: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854943.67889: getting variables 15406 1726854943.67890: in VariableManager get_vars() 15406 1726854943.67924: Calling all_inventory to load vars for managed_node2 15406 1726854943.67926: Calling groups_inventory to load vars for managed_node2 15406 1726854943.67928: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854943.67936: Calling all_plugins_play to load vars for managed_node2 15406 1726854943.67938: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854943.67940: Calling groups_plugins_play to load vars for managed_node2 15406 1726854943.68702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854943.69595: done with get_vars() 15406 1726854943.69610: done getting variables 15406 1726854943.69682: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:55:43 -0400 (0:00:00.030) 0:00:11.519 ****** 15406 1726854943.69706: entering _queue_task() for managed_node2/service 15406 1726854943.69707: Creating lock for service 15406 1726854943.69942: worker is 1 (out of 1 available) 15406 1726854943.69957: exiting _queue_task() for managed_node2/service 15406 1726854943.69970: done queuing things up, now waiting for results queue to drain 15406 1726854943.69971: waiting for pending results... 
15406 1726854943.70134: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15406 1726854943.70207: in run() - task 0affcc66-ac2b-3c83-32d3-00000000001f 15406 1726854943.70213: variable 'ansible_search_path' from source: unknown 15406 1726854943.70216: variable 'ansible_search_path' from source: unknown 15406 1726854943.70245: calling self._execute() 15406 1726854943.70315: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854943.70324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854943.70330: variable 'omit' from source: magic vars 15406 1726854943.70603: variable 'ansible_distribution_major_version' from source: facts 15406 1726854943.70612: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854943.70701: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854943.70828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854943.72497: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854943.72546: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854943.72572: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854943.72603: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854943.72624: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854943.72680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15406 1726854943.72709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.72724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.72750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.72761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.72795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854943.72816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.72832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.72856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.72866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.72896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854943.72915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.72934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.72957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.72967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.73080: variable 'network_connections' from source: play vars 15406 1726854943.73092: variable 'interface' from source: set_fact 15406 1726854943.73147: variable 'interface' from source: set_fact 15406 1726854943.73155: variable 'interface' from source: set_fact 15406 1726854943.73200: variable 'interface' from source: set_fact 15406 1726854943.73249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854943.73365: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854943.73396: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854943.73428: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854943.73450: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854943.73485: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854943.73504: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854943.73521: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.73538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854943.73586: variable '__network_team_connections_defined' from source: role '' defaults 15406 1726854943.73739: variable 'network_connections' from source: play vars 15406 1726854943.73743: variable 'interface' from source: set_fact 15406 1726854943.73786: variable 'interface' from source: set_fact 15406 1726854943.73798: variable 'interface' from source: set_fact 15406 1726854943.73837: variable 'interface' from source: set_fact 15406 1726854943.73861: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15406 1726854943.73864: when evaluation is False, skipping this task 15406 1726854943.73866: _execute() done 15406 1726854943.73869: dumping result to json 15406 1726854943.73871: done dumping result, returning 15406 1726854943.73880: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affcc66-ac2b-3c83-32d3-00000000001f] 15406 1726854943.73894: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000001f 15406 1726854943.73973: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000001f 15406 1726854943.73976: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15406 1726854943.74048: no more pending results, returning what we have 15406 1726854943.74051: results queue empty 15406 1726854943.74052: checking for any_errors_fatal 15406 1726854943.74059: done checking for any_errors_fatal 15406 1726854943.74060: checking for max_fail_percentage 15406 1726854943.74061: done checking for max_fail_percentage 15406 1726854943.74062: checking to see if all hosts have failed and the running result is not ok 15406 1726854943.74063: done checking to see if all hosts have failed 15406 1726854943.74063: getting the remaining hosts for this loop 15406 1726854943.74065: done getting the remaining hosts for this loop 15406 1726854943.74069: getting the next task for host managed_node2 15406 1726854943.74075: done getting next task for host managed_node2 15406 1726854943.74078: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15406 1726854943.74080: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854943.74094: getting variables 15406 1726854943.74096: in VariableManager get_vars() 15406 1726854943.74132: Calling all_inventory to load vars for managed_node2 15406 1726854943.74134: Calling groups_inventory to load vars for managed_node2 15406 1726854943.74136: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854943.74145: Calling all_plugins_play to load vars for managed_node2 15406 1726854943.74147: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854943.74150: Calling groups_plugins_play to load vars for managed_node2 15406 1726854943.75048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854943.75933: done with get_vars() 15406 1726854943.75948: done getting variables 15406 1726854943.75994: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:55:43 -0400 (0:00:00.063) 0:00:11.582 ****** 15406 1726854943.76015: entering _queue_task() for managed_node2/service 15406 1726854943.76252: worker is 1 (out of 1 available) 15406 1726854943.76266: exiting _queue_task() for managed_node2/service 15406 1726854943.76279: done queuing things up, now waiting for results queue to drain 15406 1726854943.76280: waiting for pending results... 
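The "Restart NetworkManager due to wireless or team interfaces" skip above is driven by two role-default flags. A hedged sketch of the task shape, with only the name and `when:` expression taken from the log and the service arguments assumed:

```yaml
# Sketch -- service module arguments are an assumption; the real task
# is at roles/network/tasks/main.yml:109 in the collection.
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Both flags evaluated False (the `network_connections` play vars define no wireless or team connections), so NetworkManager was not restarted before the "Enable and start NetworkManager" task that follows.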
15406 1726854943.76450: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15406 1726854943.76518: in run() - task 0affcc66-ac2b-3c83-32d3-000000000020 15406 1726854943.76528: variable 'ansible_search_path' from source: unknown 15406 1726854943.76531: variable 'ansible_search_path' from source: unknown 15406 1726854943.76558: calling self._execute() 15406 1726854943.76629: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854943.76633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854943.76642: variable 'omit' from source: magic vars 15406 1726854943.76913: variable 'ansible_distribution_major_version' from source: facts 15406 1726854943.76922: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854943.77029: variable 'network_provider' from source: set_fact 15406 1726854943.77032: variable 'network_state' from source: role '' defaults 15406 1726854943.77041: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15406 1726854943.77052: variable 'omit' from source: magic vars 15406 1726854943.77073: variable 'omit' from source: magic vars 15406 1726854943.77096: variable 'network_service_name' from source: role '' defaults 15406 1726854943.77146: variable 'network_service_name' from source: role '' defaults 15406 1726854943.77220: variable '__network_provider_setup' from source: role '' defaults 15406 1726854943.77224: variable '__network_service_name_default_nm' from source: role '' defaults 15406 1726854943.77272: variable '__network_service_name_default_nm' from source: role '' defaults 15406 1726854943.77281: variable '__network_packages_default_nm' from source: role '' defaults 15406 1726854943.77323: variable '__network_packages_default_nm' from source: role '' defaults 15406 1726854943.77467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15406 1726854943.78899: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854943.78953: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854943.78979: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854943.79008: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854943.79032: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854943.79092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854943.79112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.79134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.79158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.79169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.79202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15406 1726854943.79218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.79242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.79263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.79273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.79420: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15406 1726854943.79496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854943.79514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.79530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.79554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.79570: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.79629: variable 'ansible_python' from source: facts 15406 1726854943.79646: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15406 1726854943.79706: variable '__network_wpa_supplicant_required' from source: role '' defaults 15406 1726854943.79758: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15406 1726854943.79842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854943.79859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.79875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.79908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.79918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.79949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854943.79969: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854943.79985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.80018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854943.80028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854943.80118: variable 'network_connections' from source: play vars 15406 1726854943.80125: variable 'interface' from source: set_fact 15406 1726854943.80174: variable 'interface' from source: set_fact 15406 1726854943.80183: variable 'interface' from source: set_fact 15406 1726854943.80238: variable 'interface' from source: set_fact 15406 1726854943.80310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854943.80444: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854943.80476: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854943.80511: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854943.80541: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854943.80590: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854943.80610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854943.80631: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854943.80655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854943.80693: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854943.80865: variable 'network_connections' from source: play vars 15406 1726854943.80870: variable 'interface' from source: set_fact 15406 1726854943.80927: variable 'interface' from source: set_fact 15406 1726854943.80935: variable 'interface' from source: set_fact 15406 1726854943.80990: variable 'interface' from source: set_fact 15406 1726854943.81023: variable '__network_packages_default_wireless' from source: role '' defaults 15406 1726854943.81075: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854943.81259: variable 'network_connections' from source: play vars 15406 1726854943.81262: variable 'interface' from source: set_fact 15406 1726854943.81313: variable 'interface' from source: set_fact 15406 1726854943.81320: variable 'interface' from source: set_fact 15406 1726854943.81369: variable 'interface' from source: set_fact 15406 1726854943.81390: variable '__network_packages_default_team' from source: role '' defaults 15406 1726854943.81444: variable '__network_team_connections_defined' from source: role '' defaults 15406 1726854943.81624: variable 
'network_connections' from source: play vars 15406 1726854943.81627: variable 'interface' from source: set_fact 15406 1726854943.81678: variable 'interface' from source: set_fact 15406 1726854943.81684: variable 'interface' from source: set_fact 15406 1726854943.81733: variable 'interface' from source: set_fact 15406 1726854943.81776: variable '__network_service_name_default_initscripts' from source: role '' defaults 15406 1726854943.81820: variable '__network_service_name_default_initscripts' from source: role '' defaults 15406 1726854943.81826: variable '__network_packages_default_initscripts' from source: role '' defaults 15406 1726854943.81868: variable '__network_packages_default_initscripts' from source: role '' defaults 15406 1726854943.82003: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15406 1726854943.82429: variable 'network_connections' from source: play vars 15406 1726854943.82433: variable 'interface' from source: set_fact 15406 1726854943.82473: variable 'interface' from source: set_fact 15406 1726854943.82479: variable 'interface' from source: set_fact 15406 1726854943.82522: variable 'interface' from source: set_fact 15406 1726854943.82532: variable 'ansible_distribution' from source: facts 15406 1726854943.82535: variable '__network_rh_distros' from source: role '' defaults 15406 1726854943.82537: variable 'ansible_distribution_major_version' from source: facts 15406 1726854943.82562: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15406 1726854943.82673: variable 'ansible_distribution' from source: facts 15406 1726854943.82677: variable '__network_rh_distros' from source: role '' defaults 15406 1726854943.82682: variable 'ansible_distribution_major_version' from source: facts 15406 1726854943.82693: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15406 1726854943.82805: variable 'ansible_distribution' from source: 
facts 15406 1726854943.82808: variable '__network_rh_distros' from source: role '' defaults 15406 1726854943.82813: variable 'ansible_distribution_major_version' from source: facts 15406 1726854943.82837: variable 'network_provider' from source: set_fact 15406 1726854943.82857: variable 'omit' from source: magic vars 15406 1726854943.82876: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854943.82898: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854943.82913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854943.82925: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854943.82933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854943.82954: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854943.82957: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854943.82960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854943.83029: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854943.83034: Set connection var ansible_timeout to 10 15406 1726854943.83037: Set connection var ansible_connection to ssh 15406 1726854943.83041: Set connection var ansible_shell_type to sh 15406 1726854943.83046: Set connection var ansible_shell_executable to /bin/sh 15406 1726854943.83052: Set connection var ansible_pipelining to False 15406 1726854943.83072: variable 'ansible_shell_executable' from source: unknown 15406 1726854943.83076: variable 'ansible_connection' from source: unknown 15406 1726854943.83079: variable 'ansible_module_compression' from source: unknown 15406 1726854943.83082: 
variable 'ansible_shell_type' from source: unknown 15406 1726854943.83089: variable 'ansible_shell_executable' from source: unknown 15406 1726854943.83091: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854943.83099: variable 'ansible_pipelining' from source: unknown 15406 1726854943.83101: variable 'ansible_timeout' from source: unknown 15406 1726854943.83103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854943.83164: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854943.83172: variable 'omit' from source: magic vars 15406 1726854943.83178: starting attempt loop 15406 1726854943.83181: running the handler 15406 1726854943.83238: variable 'ansible_facts' from source: unknown 15406 1726854943.83664: _low_level_execute_command(): starting 15406 1726854943.83670: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854943.84183: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854943.84193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854943.84197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15406 1726854943.84199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854943.84251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854943.84254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854943.84343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854943.86137: stdout chunk (state=3): >>>/root <<< 15406 1726854943.86236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854943.86266: stderr chunk (state=3): >>><<< 15406 1726854943.86269: stdout chunk (state=3): >>><<< 15406 1726854943.86294: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854943.86304: _low_level_execute_command(): starting 15406 1726854943.86310: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854943.8629353-16045-215359925910491 `" && echo ansible-tmp-1726854943.8629353-16045-215359925910491="` echo /root/.ansible/tmp/ansible-tmp-1726854943.8629353-16045-215359925910491 `" ) && sleep 0' 15406 1726854943.86761: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854943.86764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854943.86766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854943.86768: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854943.86770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 15406 1726854943.86772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854943.86833: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854943.86837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854943.86840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854943.86902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854943.88853: stdout chunk (state=3): >>>ansible-tmp-1726854943.8629353-16045-215359925910491=/root/.ansible/tmp/ansible-tmp-1726854943.8629353-16045-215359925910491 <<< 15406 1726854943.88983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854943.88990: stdout chunk (state=3): >>><<< 15406 1726854943.88994: stderr chunk (state=3): >>><<< 15406 1726854943.89012: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854943.8629353-16045-215359925910491=/root/.ansible/tmp/ansible-tmp-1726854943.8629353-16045-215359925910491 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854943.89037: variable 'ansible_module_compression' from source: unknown 15406 1726854943.89078: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 15406 1726854943.89082: ANSIBALLZ: Acquiring lock 15406 1726854943.89095: ANSIBALLZ: Lock acquired: 140626835985552 15406 1726854943.89097: ANSIBALLZ: Creating module 15406 1726854944.14197: ANSIBALLZ: Writing module into payload 15406 1726854944.14356: ANSIBALLZ: Writing module 15406 1726854944.14373: ANSIBALLZ: Renaming module 15406 1726854944.14379: ANSIBALLZ: Done creating module 15406 1726854944.14416: variable 'ansible_facts' from source: unknown 15406 1726854944.14676: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854943.8629353-16045-215359925910491/AnsiballZ_systemd.py 15406 1726854944.14710: Sending initial data 15406 1726854944.14713: Sent initial data (156 bytes) 15406 1726854944.15393: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854944.15418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854944.15523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854944.17202: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854944.17276: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854944.17348: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpsqxrzaop /root/.ansible/tmp/ansible-tmp-1726854943.8629353-16045-215359925910491/AnsiballZ_systemd.py <<< 15406 1726854944.17365: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854943.8629353-16045-215359925910491/AnsiballZ_systemd.py" <<< 15406 1726854944.17475: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 15406 1726854944.17492: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpsqxrzaop" to remote "/root/.ansible/tmp/ansible-tmp-1726854943.8629353-16045-215359925910491/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854943.8629353-16045-215359925910491/AnsiballZ_systemd.py" <<< 15406 1726854944.18915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854944.18954: stderr chunk (state=3): >>><<< 15406 1726854944.18958: stdout chunk (state=3): >>><<< 15406 1726854944.18983: done transferring module to remote 15406 1726854944.18995: _low_level_execute_command(): starting 15406 1726854944.19000: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854943.8629353-16045-215359925910491/ /root/.ansible/tmp/ansible-tmp-1726854943.8629353-16045-215359925910491/AnsiballZ_systemd.py && sleep 0' 15406 1726854944.19440: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854944.19444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854944.19446: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 15406 1726854944.19448: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854944.19450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854944.19502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854944.19506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854944.19581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854944.21412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854944.21415: stdout chunk (state=3): >>><<< 15406 1726854944.21418: stderr chunk (state=3): >>><<< 15406 1726854944.21421: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854944.21423: _low_level_execute_command(): starting 15406 1726854944.21425: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854943.8629353-16045-215359925910491/AnsiballZ_systemd.py && sleep 0' 15406 1726854944.22000: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854944.22003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854944.22038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854944.22045: stderr chunk (state=3): >>>debug2: match found <<< 15406 1726854944.22112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854944.22134: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854944.22138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854944.22158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854944.22252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854944.51119: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6952", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainStartTimestampMonotonic": "534997129", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainHandoffTimestampMonotonic": "535012691", "ExecMainPID": "6952", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
"ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4505600", "MemoryPeak": "8568832", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3324833792", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1002291000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": 
"infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 15406 1726854944.51140: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": 
"30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service multi-user.target cloud-init.service shutdown.target", "After": "netw<<< 15406 1726854944.51148: stdout chunk (state=3): >>>ork-pre.target basic.target cloud-init-local.service dbus-broker.service dbus.socket sysinit.target systemd-journald.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:54:20 EDT", "StateChangeTimestampMonotonic": "643755242", "InactiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveExitTimestampMonotonic": "534998649", "ActiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveEnterTimestampMonotonic": "535088340", "ActiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveExitTimestampMonotonic": "534969671", "InactiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveEnterTimestampMonotonic": "534993742", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", 
"StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ConditionTimestampMonotonic": "534995785", "AssertTimestamp": "Fri 2024-09-20 13:52:31 EDT", "AssertTimestampMonotonic": "534995794", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6b946ff91244430a922b8c06a2cf2878", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15406 1726854944.53097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 15406 1726854944.53101: stdout chunk (state=3): >>><<< 15406 1726854944.53103: stderr chunk (state=3): >>><<< 15406 1726854944.53296: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6952", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainStartTimestampMonotonic": "534997129", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainHandoffTimestampMonotonic": "535012691", "ExecMainPID": "6952", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4505600", "MemoryPeak": "8568832", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3324833792", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1002291000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service multi-user.target cloud-init.service shutdown.target", "After": "network-pre.target basic.target cloud-init-local.service dbus-broker.service dbus.socket sysinit.target systemd-journald.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:54:20 EDT", "StateChangeTimestampMonotonic": "643755242", "InactiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveExitTimestampMonotonic": "534998649", "ActiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveEnterTimestampMonotonic": "535088340", "ActiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveExitTimestampMonotonic": "534969671", "InactiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveEnterTimestampMonotonic": "534993742", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:52:31 EDT", 
"ConditionTimestampMonotonic": "534995785", "AssertTimestamp": "Fri 2024-09-20 13:52:31 EDT", "AssertTimestampMonotonic": "534995794", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6b946ff91244430a922b8c06a2cf2878", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
15406 1726854944.53341: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854943.8629353-16045-215359925910491/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854944.53368: _low_level_execute_command(): starting 15406 1726854944.53377: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854943.8629353-16045-215359925910491/ > /dev/null 2>&1 && sleep 0' 15406 1726854944.53963: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854944.53977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854944.53991: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854944.54038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854944.54050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854944.54122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854944.55964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854944.55985: stderr chunk (state=3): >>><<< 15406 1726854944.55991: stdout chunk (state=3): >>><<< 15406 1726854944.56005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
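The stderr stream above shows OpenSSH reusing an existing ControlMaster socket (`auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0'`), which is why the module execution and the temp-directory cleanup each complete in a few milliseconds: no new SSH handshake is performed. Ansible enables this multiplexing by default; an equivalent explicit per-host configuration might look like the following inventory sketch (hostnames from the log, option values illustrative):

```yaml
# Hypothetical inventory vars making Ansible's default SSH
# multiplexing explicit; the ControlPath value is illustrative,
# not the literal socket path seen in the log.
all:
  hosts:
    managed_node2:
      ansible_host: 10.31.45.178
      ansible_ssh_extra_args: >-
        -o ControlMaster=auto
        -o ControlPersist=60s
        -o ControlPath=/root/.ansible/cp/%C
```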
15406 1726854944.56013: handler run complete 15406 1726854944.56051: attempt loop complete, returning result 15406 1726854944.56054: _execute() done 15406 1726854944.56056: dumping result to json 15406 1726854944.56068: done dumping result, returning 15406 1726854944.56076: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-3c83-32d3-000000000020] 15406 1726854944.56081: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000020 15406 1726854944.56314: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000020 15406 1726854944.56317: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15406 1726854944.56363: no more pending results, returning what we have 15406 1726854944.56366: results queue empty 15406 1726854944.56367: checking for any_errors_fatal 15406 1726854944.56375: done checking for any_errors_fatal 15406 1726854944.56375: checking for max_fail_percentage 15406 1726854944.56377: done checking for max_fail_percentage 15406 1726854944.56378: checking to see if all hosts have failed and the running result is not ok 15406 1726854944.56379: done checking to see if all hosts have failed 15406 1726854944.56379: getting the remaining hosts for this loop 15406 1726854944.56381: done getting the remaining hosts for this loop 15406 1726854944.56384: getting the next task for host managed_node2 15406 1726854944.56392: done getting next task for host managed_node2 15406 1726854944.56395: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15406 1726854944.56396: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15406 1726854944.56406: getting variables 15406 1726854944.56407: in VariableManager get_vars() 15406 1726854944.56440: Calling all_inventory to load vars for managed_node2 15406 1726854944.56443: Calling groups_inventory to load vars for managed_node2 15406 1726854944.56445: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854944.56454: Calling all_plugins_play to load vars for managed_node2 15406 1726854944.56457: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854944.56459: Calling groups_plugins_play to load vars for managed_node2 15406 1726854944.57832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854944.59405: done with get_vars() 15406 1726854944.59426: done getting variables 15406 1726854944.59491: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:55:44 -0400 (0:00:00.835) 0:00:12.417 ****** 15406 1726854944.59520: entering _queue_task() for managed_node2/service 15406 1726854944.59859: worker is 1 (out of 1 available) 15406 1726854944.59872: exiting _queue_task() for managed_node2/service 15406 1726854944.59884: done queuing things up, now waiting for results queue to drain 15406 1726854944.59886: waiting for pending results... 
15406 1726854944.60322: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15406 1726854944.60328: in run() - task 0affcc66-ac2b-3c83-32d3-000000000021 15406 1726854944.60331: variable 'ansible_search_path' from source: unknown 15406 1726854944.60334: variable 'ansible_search_path' from source: unknown 15406 1726854944.60336: calling self._execute() 15406 1726854944.60453: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854944.60457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854944.60460: variable 'omit' from source: magic vars 15406 1726854944.60964: variable 'ansible_distribution_major_version' from source: facts 15406 1726854944.60968: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854944.61060: variable 'network_provider' from source: set_fact 15406 1726854944.61086: Evaluated conditional (network_provider == "nm"): True 15406 1726854944.61200: variable '__network_wpa_supplicant_required' from source: role '' defaults 15406 1726854944.61338: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15406 1726854944.61558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854944.65993: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854944.65997: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854944.66039: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854944.66081: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854944.66294: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854944.66438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854944.66475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854944.66561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854944.66660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854944.66682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854944.66799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854944.66886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854944.66921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854944.67013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854944.67100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854944.67141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854944.67213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854944.67412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854944.67415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854944.67417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854944.67797: variable 'network_connections' from source: play vars 15406 1726854944.67817: variable 'interface' from source: set_fact 15406 1726854944.68009: variable 'interface' from source: set_fact 15406 1726854944.68029: variable 'interface' from source: set_fact 15406 1726854944.68177: variable 'interface' from source: set_fact 15406 1726854944.68395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854944.68750: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854944.68814: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854944.68858: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854944.68901: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854944.68954: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854944.68994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854944.69012: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854944.69047: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854944.69154: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854944.69607: variable 'network_connections' from source: play vars 15406 1726854944.69697: variable 'interface' from source: set_fact 15406 1726854944.69780: variable 'interface' from source: set_fact 15406 1726854944.69800: variable 'interface' from source: set_fact 15406 1726854944.70099: variable 'interface' from source: set_fact 15406 1726854944.70102: Evaluated conditional (__network_wpa_supplicant_required): False 15406 1726854944.70105: when evaluation is False, skipping this task 15406 1726854944.70107: _execute() done 15406 1726854944.70117: dumping result 
to json 15406 1726854944.70120: done dumping result, returning 15406 1726854944.70122: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-3c83-32d3-000000000021] 15406 1726854944.70124: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000021 15406 1726854944.70195: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000021 15406 1726854944.70205: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15406 1726854944.70254: no more pending results, returning what we have 15406 1726854944.70258: results queue empty 15406 1726854944.70259: checking for any_errors_fatal 15406 1726854944.70286: done checking for any_errors_fatal 15406 1726854944.70290: checking for max_fail_percentage 15406 1726854944.70292: done checking for max_fail_percentage 15406 1726854944.70293: checking to see if all hosts have failed and the running result is not ok 15406 1726854944.70294: done checking to see if all hosts have failed 15406 1726854944.70295: getting the remaining hosts for this loop 15406 1726854944.70296: done getting the remaining hosts for this loop 15406 1726854944.70300: getting the next task for host managed_node2 15406 1726854944.70315: done getting next task for host managed_node2 15406 1726854944.70319: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15406 1726854944.70321: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854944.70334: getting variables 15406 1726854944.70336: in VariableManager get_vars() 15406 1726854944.70376: Calling all_inventory to load vars for managed_node2 15406 1726854944.70379: Calling groups_inventory to load vars for managed_node2 15406 1726854944.70382: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854944.70590: Calling all_plugins_play to load vars for managed_node2 15406 1726854944.70594: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854944.70599: Calling groups_plugins_play to load vars for managed_node2 15406 1726854944.74700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854944.77390: done with get_vars() 15406 1726854944.77419: done getting variables 15406 1726854944.77492: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:55:44 -0400 (0:00:00.180) 0:00:12.597 ****** 15406 1726854944.77523: entering _queue_task() for managed_node2/service 15406 1726854944.77927: worker is 1 (out of 1 available) 15406 1726854944.77939: exiting _queue_task() for managed_node2/service 15406 1726854944.77951: done queuing things up, now waiting for results queue to drain 15406 1726854944.77952: waiting for pending results... 
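The skipped wpa_supplicant task above shows the shape Ansible gives every conditional skip: when a `when:` expression evaluates to False, the executor short-circuits before dispatching the module and returns a skip result instead. A minimal sketch of that decision, using hypothetical helper names rather than Ansible's real internals (which template the expression through Jinja2):

```python
# Hypothetical sketch of how a False `when:` conditional becomes the
# "skipping" result JSON seen in the log. evaluate_conditional and
# run_or_skip are illustrative names, not Ansible's actual API.
def evaluate_conditional(condition, variables):
    # Ansible templates the expression with Jinja2; here we just look
    # the variable up directly to keep the sketch self-contained.
    return bool(variables.get(condition, False))

def run_or_skip(condition, variables):
    if not evaluate_conditional(condition, variables):
        # Matches the skip payload in the log output above.
        return {
            "changed": False,
            "false_condition": condition,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

result = run_or_skip(
    "__network_wpa_supplicant_required",
    {"__network_wpa_supplicant_required": False},
)
```

With the variable False, the returned dict carries the same `false_condition` and `skip_reason` fields printed for managed_node2 above.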
15406 1726854944.78227: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 15406 1726854944.78300: in run() - task 0affcc66-ac2b-3c83-32d3-000000000022 15406 1726854944.78326: variable 'ansible_search_path' from source: unknown 15406 1726854944.78346: variable 'ansible_search_path' from source: unknown 15406 1726854944.78433: calling self._execute() 15406 1726854944.78474: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854944.78488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854944.78504: variable 'omit' from source: magic vars 15406 1726854944.78912: variable 'ansible_distribution_major_version' from source: facts 15406 1726854944.78928: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854944.79043: variable 'network_provider' from source: set_fact 15406 1726854944.79053: Evaluated conditional (network_provider == "initscripts"): False 15406 1726854944.79059: when evaluation is False, skipping this task 15406 1726854944.79065: _execute() done 15406 1726854944.79070: dumping result to json 15406 1726854944.79101: done dumping result, returning 15406 1726854944.79104: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-3c83-32d3-000000000022] 15406 1726854944.79109: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000022 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15406 1726854944.79347: no more pending results, returning what we have 15406 1726854944.79352: results queue empty 15406 1726854944.79353: checking for any_errors_fatal 15406 1726854944.79365: done checking for any_errors_fatal 15406 1726854944.79366: checking for max_fail_percentage 15406 1726854944.79368: done checking for max_fail_percentage 15406 
1726854944.79370: checking to see if all hosts have failed and the running result is not ok 15406 1726854944.79371: done checking to see if all hosts have failed 15406 1726854944.79371: getting the remaining hosts for this loop 15406 1726854944.79373: done getting the remaining hosts for this loop 15406 1726854944.79377: getting the next task for host managed_node2 15406 1726854944.79384: done getting next task for host managed_node2 15406 1726854944.79390: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15406 1726854944.79395: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854944.79415: getting variables 15406 1726854944.79417: in VariableManager get_vars() 15406 1726854944.79459: Calling all_inventory to load vars for managed_node2 15406 1726854944.79462: Calling groups_inventory to load vars for managed_node2 15406 1726854944.79465: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854944.79479: Calling all_plugins_play to load vars for managed_node2 15406 1726854944.79482: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854944.79485: Calling groups_plugins_play to load vars for managed_node2 15406 1726854944.79690: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000022 15406 1726854944.79694: WORKER PROCESS EXITING 15406 1726854944.81190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854944.83058: done with get_vars() 15406 1726854944.83192: done getting variables 15406 1726854944.83256: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:55:44 -0400 (0:00:00.057) 0:00:12.655 ****** 15406 1726854944.83409: entering _queue_task() for managed_node2/copy 15406 1726854944.83901: worker is 1 (out of 1 available) 15406 1726854944.83951: exiting _queue_task() for managed_node2/copy 15406 1726854944.83964: done queuing things up, now waiting for results queue to drain 15406 1726854944.83966: waiting for pending results... 15406 1726854944.84234: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15406 1726854944.84327: in run() - task 0affcc66-ac2b-3c83-32d3-000000000023 15406 1726854944.84347: variable 'ansible_search_path' from source: unknown 15406 1726854944.84351: variable 'ansible_search_path' from source: unknown 15406 1726854944.84381: calling self._execute() 15406 1726854944.84563: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854944.84567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854944.84576: variable 'omit' from source: magic vars 15406 1726854944.84889: variable 'ansible_distribution_major_version' from source: facts 15406 1726854944.84897: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854944.84995: variable 'network_provider' from source: set_fact 15406 1726854944.85004: Evaluated conditional (network_provider == "initscripts"): False 15406 1726854944.85008: when evaluation is False, skipping this task 15406 1726854944.85011: _execute() done 15406 1726854944.85013: dumping result to json 
15406 1726854944.85016: done dumping result, returning 15406 1726854944.85025: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-3c83-32d3-000000000023] 15406 1726854944.85028: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000023 15406 1726854944.85210: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000023 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15406 1726854944.85255: no more pending results, returning what we have 15406 1726854944.85259: results queue empty 15406 1726854944.85259: checking for any_errors_fatal 15406 1726854944.85264: done checking for any_errors_fatal 15406 1726854944.85265: checking for max_fail_percentage 15406 1726854944.85266: done checking for max_fail_percentage 15406 1726854944.85267: checking to see if all hosts have failed and the running result is not ok 15406 1726854944.85267: done checking to see if all hosts have failed 15406 1726854944.85268: getting the remaining hosts for this loop 15406 1726854944.85269: done getting the remaining hosts for this loop 15406 1726854944.85272: getting the next task for host managed_node2 15406 1726854944.85277: done getting next task for host managed_node2 15406 1726854944.85280: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15406 1726854944.85281: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854944.85296: getting variables 15406 1726854944.85297: in VariableManager get_vars() 15406 1726854944.85333: Calling all_inventory to load vars for managed_node2 15406 1726854944.85336: Calling groups_inventory to load vars for managed_node2 15406 1726854944.85338: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854944.85346: Calling all_plugins_play to load vars for managed_node2 15406 1726854944.85349: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854944.85351: Calling groups_plugins_play to load vars for managed_node2 15406 1726854944.85934: WORKER PROCESS EXITING 15406 1726854944.87239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854944.89257: done with get_vars() 15406 1726854944.89277: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:55:44 -0400 (0:00:00.060) 0:00:12.716 ****** 15406 1726854944.89370: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 15406 1726854944.89372: Creating lock for fedora.linux_system_roles.network_connections 15406 1726854944.89725: worker is 1 (out of 1 available) 15406 1726854944.89738: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 15406 1726854944.89798: done queuing things up, now waiting for results queue to drain 15406 1726854944.89800: waiting for pending results... 
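The `_low_level_execute_command()` calls that follow all use the same pattern: wrap the command in `/bin/sh -c '... && sleep 0'` and collect return code, stdout, and stderr. A local-subprocess approximation of that pattern (the real calls go through the ssh connection plugin and its control master, as the debug chunks show):

```python
import subprocess

# Local approximation of the _low_level_execute_command() pattern in the
# log; Ansible runs these over SSH, but the shell invocation is the same.
def low_level_execute(cmd):
    proc = subprocess.run(
        ["/bin/sh", "-c", cmd],
        capture_output=True,
        text=True,
    )
    return proc.returncode, proc.stdout, proc.stderr

# The first probe in the log discovers the remote home directory with
# `echo ~ && sleep 0`, which the controller uses to build remote tmp paths.
rc, out, err = low_level_execute("echo ~ && sleep 0")
```

The trailing `&& sleep 0` is part of the command string Ansible emits; functionally the sketch only depends on the `sh -c` wrapping and the captured `rc`/`stdout`/`stderr` triple.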
15406 1726854944.90165: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15406 1726854944.90362: in run() - task 0affcc66-ac2b-3c83-32d3-000000000024 15406 1726854944.90374: variable 'ansible_search_path' from source: unknown 15406 1726854944.90378: variable 'ansible_search_path' from source: unknown 15406 1726854944.90525: calling self._execute() 15406 1726854944.90639: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854944.90643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854944.90652: variable 'omit' from source: magic vars 15406 1726854944.91488: variable 'ansible_distribution_major_version' from source: facts 15406 1726854944.91501: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854944.91510: variable 'omit' from source: magic vars 15406 1726854944.91604: variable 'omit' from source: magic vars 15406 1726854944.91983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854944.96295: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854944.96356: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854944.96506: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854944.96540: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854944.96565: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854944.96744: variable 'network_provider' from source: set_fact 15406 1726854944.96989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854944.97080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854944.97145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854944.97261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854944.97276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854944.97369: variable 'omit' from source: magic vars 15406 1726854944.97574: variable 'omit' from source: magic vars 15406 1726854944.97993: variable 'network_connections' from source: play vars 15406 1726854944.97996: variable 'interface' from source: set_fact 15406 1726854944.97999: variable 'interface' from source: set_fact 15406 1726854944.98002: variable 'interface' from source: set_fact 15406 1726854944.98144: variable 'interface' from source: set_fact 15406 1726854944.98452: variable 'omit' from source: magic vars 15406 1726854944.98571: variable '__lsr_ansible_managed' from source: task vars 15406 1726854944.98633: variable '__lsr_ansible_managed' from source: task vars 15406 1726854944.99033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15406 1726854944.99380: Loaded config def from plugin (lookup/template) 15406 1726854944.99384: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15406 1726854944.99443: File lookup term: get_ansible_managed.j2 15406 1726854944.99446: variable 'ansible_search_path' from source: unknown 15406 1726854944.99452: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15406 1726854944.99488: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15406 1726854944.99507: variable 'ansible_search_path' from source: unknown 15406 1726854945.07706: variable 'ansible_managed' from source: unknown 15406 1726854945.07831: variable 'omit' from source: magic vars 15406 1726854945.07860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854945.07886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854945.07908: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854945.07925: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854945.07936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854945.07968: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854945.07971: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854945.07973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854945.08071: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854945.08077: Set connection var ansible_timeout to 10 15406 1726854945.08080: Set connection var ansible_connection to ssh 15406 1726854945.08095: Set connection var ansible_shell_type to sh 15406 1726854945.08098: Set connection var ansible_shell_executable to /bin/sh 15406 1726854945.08292: Set connection var ansible_pipelining to False 15406 1726854945.08295: variable 'ansible_shell_executable' from source: unknown 15406 1726854945.08298: variable 'ansible_connection' from source: unknown 15406 1726854945.08301: variable 'ansible_module_compression' from source: unknown 15406 1726854945.08303: variable 'ansible_shell_type' from source: unknown 15406 1726854945.08305: variable 'ansible_shell_executable' from source: unknown 15406 1726854945.08307: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854945.08309: variable 'ansible_pipelining' from source: unknown 15406 1726854945.08311: variable 'ansible_timeout' from source: unknown 15406 1726854945.08313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854945.08316: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854945.08325: variable 'omit' from source: magic vars 15406 1726854945.08327: starting attempt loop 15406 1726854945.08329: running the handler 15406 1726854945.08331: _low_level_execute_command(): starting 15406 1726854945.08333: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854945.09014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854945.09046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854945.09058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15406 1726854945.09107: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854945.09150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854945.09161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854945.09179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854945.09296: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 15406 1726854945.11012: stdout chunk (state=3): >>>/root <<< 15406 1726854945.11294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854945.11297: stderr chunk (state=3): >>><<< 15406 1726854945.11299: stdout chunk (state=3): >>><<< 15406 1726854945.11302: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854945.11304: _low_level_execute_command(): starting 15406 1726854945.11307: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854945.1118634-16079-31557027058266 `" && echo ansible-tmp-1726854945.1118634-16079-31557027058266="` echo /root/.ansible/tmp/ansible-tmp-1726854945.1118634-16079-31557027058266 `" ) && sleep 0' 15406 
1726854945.11839: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854945.11861: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854945.11872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854945.11902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854945.11970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854945.12012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854945.12024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854945.12033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854945.12133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854945.14082: stdout chunk (state=3): >>>ansible-tmp-1726854945.1118634-16079-31557027058266=/root/.ansible/tmp/ansible-tmp-1726854945.1118634-16079-31557027058266 <<< 15406 1726854945.14256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854945.14260: stderr chunk (state=3): >>><<< 15406 1726854945.14265: stdout chunk (state=3): 
>>><<< 15406 1726854945.14521: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854945.1118634-16079-31557027058266=/root/.ansible/tmp/ansible-tmp-1726854945.1118634-16079-31557027058266 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854945.14524: variable 'ansible_module_compression' from source: unknown 15406 1726854945.14527: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 15406 1726854945.14529: ANSIBALLZ: Acquiring lock 15406 1726854945.14533: ANSIBALLZ: Lock acquired: 140626829994624 15406 1726854945.14536: ANSIBALLZ: Creating module 15406 1726854945.37500: ANSIBALLZ: Writing module into payload 15406 1726854945.37860: ANSIBALLZ: Writing module 15406 1726854945.37890: ANSIBALLZ: Renaming module 15406 1726854945.37943: ANSIBALLZ: Done creating module 15406 1726854945.37946: variable 'ansible_facts' 
from source: unknown 15406 1726854945.38067: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854945.1118634-16079-31557027058266/AnsiballZ_network_connections.py 15406 1726854945.38220: Sending initial data 15406 1726854945.38227: Sent initial data (167 bytes) 15406 1726854945.38846: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854945.38936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854945.38940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854945.38981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854945.38999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854945.39134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854945.40802: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 15406 1726854945.40808: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854945.40872: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15406 1726854945.40937: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp0a0n8u09 /root/.ansible/tmp/ansible-tmp-1726854945.1118634-16079-31557027058266/AnsiballZ_network_connections.py <<< 15406 1726854945.40943: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854945.1118634-16079-31557027058266/AnsiballZ_network_connections.py" <<< 15406 1726854945.41023: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp0a0n8u09" to remote "/root/.ansible/tmp/ansible-tmp-1726854945.1118634-16079-31557027058266/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854945.1118634-16079-31557027058266/AnsiballZ_network_connections.py" <<< 15406 1726854945.42189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854945.42249: stderr chunk (state=3): >>><<< 15406 1726854945.42252: stdout chunk (state=3): >>><<< 15406 1726854945.42270: done transferring module to remote 15406 1726854945.42305: _low_level_execute_command(): starting 15406 1726854945.42308: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726854945.1118634-16079-31557027058266/ /root/.ansible/tmp/ansible-tmp-1726854945.1118634-16079-31557027058266/AnsiballZ_network_connections.py && sleep 0' 15406 1726854945.42894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854945.42898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854945.42901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 15406 1726854945.42906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 15406 1726854945.42910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854945.42954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854945.42977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854945.43072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854945.44869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854945.44893: stderr chunk (state=3): >>><<< 15406 1726854945.44897: stdout chunk (state=3): >>><<< 15406 1726854945.44914: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854945.44917: _low_level_execute_command(): starting 15406 1726854945.44919: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854945.1118634-16079-31557027058266/AnsiballZ_network_connections.py && sleep 0' 15406 1726854945.45884: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854945.45925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854945.45941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854945.46099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854945.76586: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 2f13cc6b-f32a-48d6-a24a-0c29170576ff\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 2f13cc6b-f32a-48d6-a24a-0c29170576ff (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15406 1726854945.78638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
Shared connection to 10.31.45.178 closed. <<< 15406 1726854945.78643: stdout chunk (state=3): >>><<< 15406 1726854945.78645: stderr chunk (state=3): >>><<< 15406 1726854945.78658: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 2f13cc6b-f32a-48d6-a24a-0c29170576ff\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 2f13cc6b-f32a-48d6-a24a-0c29170576ff (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854945.78746: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'interface_name': 'LSR-TST-br31', 'state': 'up', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854945.1118634-16079-31557027058266/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854945.78750: _low_level_execute_command(): starting 15406 1726854945.78752: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854945.1118634-16079-31557027058266/ > /dev/null 2>&1 && sleep 0' 15406 1726854945.79367: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854945.79401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854945.79509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854945.79536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854945.79653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854945.81549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854945.81578: stderr chunk (state=3): >>><<< 15406 1726854945.81581: stdout chunk (state=3): >>><<< 15406 1726854945.81598: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854945.81606: handler run complete 15406 1726854945.81626: attempt loop complete, returning result 15406 1726854945.81629: _execute() done 15406 1726854945.81631: dumping result to json 15406 1726854945.81637: done dumping result, returning 15406 1726854945.81644: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-3c83-32d3-000000000024] 15406 1726854945.81664: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000024 15406 1726854945.81976: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000024 15406 1726854945.81980: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 2f13cc6b-f32a-48d6-a24a-0c29170576ff [004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 2f13cc6b-f32a-48d6-a24a-0c29170576ff (not-active) 15406 1726854945.82092: no more pending results, 
returning what we have 15406 1726854945.82095: results queue empty 15406 1726854945.82097: checking for any_errors_fatal 15406 1726854945.82106: done checking for any_errors_fatal 15406 1726854945.82107: checking for max_fail_percentage 15406 1726854945.82109: done checking for max_fail_percentage 15406 1726854945.82109: checking to see if all hosts have failed and the running result is not ok 15406 1726854945.82110: done checking to see if all hosts have failed 15406 1726854945.82111: getting the remaining hosts for this loop 15406 1726854945.82112: done getting the remaining hosts for this loop 15406 1726854945.82116: getting the next task for host managed_node2 15406 1726854945.82122: done getting next task for host managed_node2 15406 1726854945.82125: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15406 1726854945.82127: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854945.82137: getting variables 15406 1726854945.82138: in VariableManager get_vars() 15406 1726854945.82173: Calling all_inventory to load vars for managed_node2 15406 1726854945.82176: Calling groups_inventory to load vars for managed_node2 15406 1726854945.82179: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854945.82308: Calling all_plugins_play to load vars for managed_node2 15406 1726854945.82313: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854945.82317: Calling groups_plugins_play to load vars for managed_node2 15406 1726854945.83277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854945.84143: done with get_vars() 15406 1726854945.84161: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:55:45 -0400 (0:00:00.948) 0:00:13.665 ****** 15406 1726854945.84225: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 15406 1726854945.84226: Creating lock for fedora.linux_system_roles.network_state 15406 1726854945.84461: worker is 1 (out of 1 available) 15406 1726854945.84505: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 15406 1726854945.84518: done queuing things up, now waiting for results queue to drain 15406 1726854945.84522: waiting for pending results... 
15406 1726854945.84783: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 15406 1726854945.84860: in run() - task 0affcc66-ac2b-3c83-32d3-000000000025 15406 1726854945.84871: variable 'ansible_search_path' from source: unknown 15406 1726854945.84875: variable 'ansible_search_path' from source: unknown 15406 1726854945.84906: calling self._execute() 15406 1726854945.84970: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854945.84975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854945.84982: variable 'omit' from source: magic vars 15406 1726854945.85697: variable 'ansible_distribution_major_version' from source: facts 15406 1726854945.85703: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854945.85750: variable 'network_state' from source: role '' defaults 15406 1726854945.85761: Evaluated conditional (network_state != {}): False 15406 1726854945.85764: when evaluation is False, skipping this task 15406 1726854945.85766: _execute() done 15406 1726854945.85768: dumping result to json 15406 1726854945.85771: done dumping result, returning 15406 1726854945.85806: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-3c83-32d3-000000000025] 15406 1726854945.85809: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000025 15406 1726854945.86071: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000025 15406 1726854945.86075: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15406 1726854945.86297: no more pending results, returning what we have 15406 1726854945.86301: results queue empty 15406 1726854945.86302: checking for any_errors_fatal 15406 1726854945.86310: done checking for any_errors_fatal 
15406 1726854945.86310: checking for max_fail_percentage 15406 1726854945.86312: done checking for max_fail_percentage 15406 1726854945.86313: checking to see if all hosts have failed and the running result is not ok 15406 1726854945.86314: done checking to see if all hosts have failed 15406 1726854945.86314: getting the remaining hosts for this loop 15406 1726854945.86315: done getting the remaining hosts for this loop 15406 1726854945.86319: getting the next task for host managed_node2 15406 1726854945.86325: done getting next task for host managed_node2 15406 1726854945.86328: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15406 1726854945.86331: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854945.86344: getting variables 15406 1726854945.86346: in VariableManager get_vars() 15406 1726854945.86380: Calling all_inventory to load vars for managed_node2 15406 1726854945.86382: Calling groups_inventory to load vars for managed_node2 15406 1726854945.86384: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854945.86396: Calling all_plugins_play to load vars for managed_node2 15406 1726854945.86402: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854945.86405: Calling groups_plugins_play to load vars for managed_node2 15406 1726854945.89508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854945.91505: done with get_vars() 15406 1726854945.91527: done getting variables 15406 1726854945.91637: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:55:45 -0400 (0:00:00.074) 0:00:13.739 ****** 15406 1726854945.91710: entering _queue_task() for managed_node2/debug 15406 1726854945.92093: worker is 1 (out of 1 available) 15406 1726854945.92110: exiting _queue_task() for managed_node2/debug 15406 1726854945.92123: done queuing things up, now waiting for results queue to drain 15406 1726854945.92125: waiting for pending results... 
15406 1726854945.92914: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15406 1726854945.93213: in run() - task 0affcc66-ac2b-3c83-32d3-000000000026 15406 1726854945.93220: variable 'ansible_search_path' from source: unknown 15406 1726854945.93224: variable 'ansible_search_path' from source: unknown 15406 1726854945.93239: calling self._execute() 15406 1726854945.93497: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854945.93501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854945.93508: variable 'omit' from source: magic vars 15406 1726854945.94300: variable 'ansible_distribution_major_version' from source: facts 15406 1726854945.94316: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854945.94493: variable 'omit' from source: magic vars 15406 1726854945.94496: variable 'omit' from source: magic vars 15406 1726854945.94498: variable 'omit' from source: magic vars 15406 1726854945.94598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854945.94634: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854945.94713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854945.94734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854945.94775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854945.94908: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854945.94916: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854945.94995: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 15406 1726854945.95132: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854945.95144: Set connection var ansible_timeout to 10 15406 1726854945.95150: Set connection var ansible_connection to ssh 15406 1726854945.95159: Set connection var ansible_shell_type to sh 15406 1726854945.95167: Set connection var ansible_shell_executable to /bin/sh 15406 1726854945.95177: Set connection var ansible_pipelining to False 15406 1726854945.95210: variable 'ansible_shell_executable' from source: unknown 15406 1726854945.95218: variable 'ansible_connection' from source: unknown 15406 1726854945.95273: variable 'ansible_module_compression' from source: unknown 15406 1726854945.95280: variable 'ansible_shell_type' from source: unknown 15406 1726854945.95286: variable 'ansible_shell_executable' from source: unknown 15406 1726854945.95296: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854945.95303: variable 'ansible_pipelining' from source: unknown 15406 1726854945.95310: variable 'ansible_timeout' from source: unknown 15406 1726854945.95321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854945.95695: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854945.95699: variable 'omit' from source: magic vars 15406 1726854945.95701: starting attempt loop 15406 1726854945.95704: running the handler 15406 1726854945.95854: variable '__network_connections_result' from source: set_fact 15406 1726854945.95941: handler run complete 15406 1726854945.95964: attempt loop complete, returning result 15406 1726854945.95970: _execute() done 15406 1726854945.95977: dumping result to json 15406 1726854945.95991: 
done dumping result, returning 15406 1726854945.96004: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-3c83-32d3-000000000026] 15406 1726854945.96012: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000026 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 2f13cc6b-f32a-48d6-a24a-0c29170576ff", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 2f13cc6b-f32a-48d6-a24a-0c29170576ff (not-active)" ] } 15406 1726854945.96248: no more pending results, returning what we have 15406 1726854945.96252: results queue empty 15406 1726854945.96253: checking for any_errors_fatal 15406 1726854945.96259: done checking for any_errors_fatal 15406 1726854945.96260: checking for max_fail_percentage 15406 1726854945.96266: done checking for max_fail_percentage 15406 1726854945.96267: checking to see if all hosts have failed and the running result is not ok 15406 1726854945.96268: done checking to see if all hosts have failed 15406 1726854945.96269: getting the remaining hosts for this loop 15406 1726854945.96270: done getting the remaining hosts for this loop 15406 1726854945.96274: getting the next task for host managed_node2 15406 1726854945.96281: done getting next task for host managed_node2 15406 1726854945.96284: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15406 1726854945.96286: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854945.96304: getting variables 15406 1726854945.96306: in VariableManager get_vars() 15406 1726854945.96339: Calling all_inventory to load vars for managed_node2 15406 1726854945.96341: Calling groups_inventory to load vars for managed_node2 15406 1726854945.96343: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854945.96353: Calling all_plugins_play to load vars for managed_node2 15406 1726854945.96356: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854945.96359: Calling groups_plugins_play to load vars for managed_node2 15406 1726854945.97683: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000026 15406 1726854945.98195: WORKER PROCESS EXITING 15406 1726854945.99293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854946.01014: done with get_vars() 15406 1726854946.01034: done getting variables 15406 1726854946.01112: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:55:46 -0400 (0:00:00.094) 0:00:13.834 ****** 15406 1726854946.01153: entering _queue_task() for managed_node2/debug 15406 1726854946.01508: worker is 1 (out of 1 available) 15406 1726854946.01698: exiting _queue_task() for managed_node2/debug 15406 1726854946.01709: done queuing things up, now waiting for results queue to drain 15406 1726854946.01710: waiting for pending results... 
15406 1726854946.01810: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15406 1726854946.01912: in run() - task 0affcc66-ac2b-3c83-32d3-000000000027 15406 1726854946.02028: variable 'ansible_search_path' from source: unknown 15406 1726854946.02110: variable 'ansible_search_path' from source: unknown 15406 1726854946.02249: calling self._execute() 15406 1726854946.02362: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854946.02431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854946.02507: variable 'omit' from source: magic vars 15406 1726854946.03343: variable 'ansible_distribution_major_version' from source: facts 15406 1726854946.03347: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854946.03349: variable 'omit' from source: magic vars 15406 1726854946.03495: variable 'omit' from source: magic vars 15406 1726854946.03568: variable 'omit' from source: magic vars 15406 1726854946.03778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854946.03782: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854946.03795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854946.03816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854946.03838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854946.03873: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854946.03937: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854946.03955: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 15406 1726854946.04176: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854946.04189: Set connection var ansible_timeout to 10 15406 1726854946.04216: Set connection var ansible_connection to ssh 15406 1726854946.04227: Set connection var ansible_shell_type to sh 15406 1726854946.04295: Set connection var ansible_shell_executable to /bin/sh 15406 1726854946.04308: Set connection var ansible_pipelining to False 15406 1726854946.04345: variable 'ansible_shell_executable' from source: unknown 15406 1726854946.04353: variable 'ansible_connection' from source: unknown 15406 1726854946.04359: variable 'ansible_module_compression' from source: unknown 15406 1726854946.04398: variable 'ansible_shell_type' from source: unknown 15406 1726854946.04404: variable 'ansible_shell_executable' from source: unknown 15406 1726854946.04410: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854946.04417: variable 'ansible_pipelining' from source: unknown 15406 1726854946.04430: variable 'ansible_timeout' from source: unknown 15406 1726854946.04437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854946.04698: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854946.04770: variable 'omit' from source: magic vars 15406 1726854946.04780: starting attempt loop 15406 1726854946.04976: running the handler 15406 1726854946.04979: variable '__network_connections_result' from source: set_fact 15406 1726854946.05128: variable '__network_connections_result' from source: set_fact 15406 1726854946.05494: handler run complete 15406 1726854946.05497: attempt loop complete, returning result 15406 1726854946.05499: 
_execute() done 15406 1726854946.05501: dumping result to json 15406 1726854946.05503: done dumping result, returning 15406 1726854946.05505: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-3c83-32d3-000000000027] 15406 1726854946.05506: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000027 15406 1726854946.05572: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000027 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 2f13cc6b-f32a-48d6-a24a-0c29170576ff\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 2f13cc6b-f32a-48d6-a24a-0c29170576ff (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 2f13cc6b-f32a-48d6-a24a-0c29170576ff", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 2f13cc6b-f32a-48d6-a24a-0c29170576ff (not-active)" ] } } 15406 1726854946.05657: no more pending results, returning what we have 15406 1726854946.05660: results queue empty 15406 1726854946.05662: checking for any_errors_fatal 15406 1726854946.05670: done checking for any_errors_fatal 15406 1726854946.05671: checking for max_fail_percentage 15406 1726854946.05672: done checking for max_fail_percentage 15406 1726854946.05673: checking to see if all hosts have failed and the running result is not ok 15406 1726854946.05675: done 
checking to see if all hosts have failed 15406 1726854946.05675: getting the remaining hosts for this loop 15406 1726854946.05677: done getting the remaining hosts for this loop 15406 1726854946.05680: getting the next task for host managed_node2 15406 1726854946.05798: done getting next task for host managed_node2 15406 1726854946.05802: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15406 1726854946.05804: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854946.05814: getting variables 15406 1726854946.05816: in VariableManager get_vars() 15406 1726854946.05851: Calling all_inventory to load vars for managed_node2 15406 1726854946.05854: Calling groups_inventory to load vars for managed_node2 15406 1726854946.05857: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854946.05867: Calling all_plugins_play to load vars for managed_node2 15406 1726854946.05870: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854946.05872: Calling groups_plugins_play to load vars for managed_node2 15406 1726854946.06648: WORKER PROCESS EXITING 15406 1726854946.09443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854946.12418: done with get_vars() 15406 1726854946.12449: done getting variables 15406 1726854946.12525: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug 
messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:55:46 -0400 (0:00:00.114) 0:00:13.948 ****** 15406 1726854946.12559: entering _queue_task() for managed_node2/debug 15406 1726854946.12921: worker is 1 (out of 1 available) 15406 1726854946.12933: exiting _queue_task() for managed_node2/debug 15406 1726854946.12945: done queuing things up, now waiting for results queue to drain 15406 1726854946.12947: waiting for pending results... 15406 1726854946.13306: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15406 1726854946.13336: in run() - task 0affcc66-ac2b-3c83-32d3-000000000028 15406 1726854946.13350: variable 'ansible_search_path' from source: unknown 15406 1726854946.13353: variable 'ansible_search_path' from source: unknown 15406 1726854946.13694: calling self._execute() 15406 1726854946.13697: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854946.13700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854946.13704: variable 'omit' from source: magic vars 15406 1726854946.13871: variable 'ansible_distribution_major_version' from source: facts 15406 1726854946.13883: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854946.14016: variable 'network_state' from source: role '' defaults 15406 1726854946.14026: Evaluated conditional (network_state != {}): False 15406 1726854946.14029: when evaluation is False, skipping this task 15406 1726854946.14032: _execute() done 15406 1726854946.14034: dumping result to json 15406 1726854946.14037: done dumping result, returning 15406 1726854946.14051: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-3c83-32d3-000000000028] 15406 
1726854946.14056: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000028 15406 1726854946.14140: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000028 15406 1726854946.14143: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 15406 1726854946.14201: no more pending results, returning what we have 15406 1726854946.14207: results queue empty 15406 1726854946.14208: checking for any_errors_fatal 15406 1726854946.14219: done checking for any_errors_fatal 15406 1726854946.14220: checking for max_fail_percentage 15406 1726854946.14222: done checking for max_fail_percentage 15406 1726854946.14223: checking to see if all hosts have failed and the running result is not ok 15406 1726854946.14224: done checking to see if all hosts have failed 15406 1726854946.14225: getting the remaining hosts for this loop 15406 1726854946.14226: done getting the remaining hosts for this loop 15406 1726854946.14231: getting the next task for host managed_node2 15406 1726854946.14239: done getting next task for host managed_node2 15406 1726854946.14243: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15406 1726854946.14246: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854946.14262: getting variables 15406 1726854946.14264: in VariableManager get_vars() 15406 1726854946.14309: Calling all_inventory to load vars for managed_node2 15406 1726854946.14312: Calling groups_inventory to load vars for managed_node2 15406 1726854946.14315: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854946.14327: Calling all_plugins_play to load vars for managed_node2 15406 1726854946.14330: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854946.14332: Calling groups_plugins_play to load vars for managed_node2 15406 1726854946.15941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854946.17576: done with get_vars() 15406 1726854946.17606: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:55:46 -0400 (0:00:00.051) 0:00:13.999 ****** 15406 1726854946.17708: entering _queue_task() for managed_node2/ping 15406 1726854946.17710: Creating lock for ping 15406 1726854946.18067: worker is 1 (out of 1 available) 15406 1726854946.18198: exiting _queue_task() for managed_node2/ping 15406 1726854946.18210: done queuing things up, now waiting for results queue to drain 15406 1726854946.18212: waiting for pending results... 
15406 1726854946.18386: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 15406 1726854946.18479: in run() - task 0affcc66-ac2b-3c83-32d3-000000000029 15406 1726854946.18492: variable 'ansible_search_path' from source: unknown 15406 1726854946.18498: variable 'ansible_search_path' from source: unknown 15406 1726854946.18537: calling self._execute() 15406 1726854946.18697: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854946.18701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854946.18703: variable 'omit' from source: magic vars 15406 1726854946.19148: variable 'ansible_distribution_major_version' from source: facts 15406 1726854946.19162: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854946.19165: variable 'omit' from source: magic vars 15406 1726854946.19422: variable 'omit' from source: magic vars 15406 1726854946.19457: variable 'omit' from source: magic vars 15406 1726854946.19502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854946.19653: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854946.19673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854946.19694: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854946.19994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854946.19998: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854946.20000: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854946.20002: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 15406 1726854946.20119: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854946.20124: Set connection var ansible_timeout to 10 15406 1726854946.20127: Set connection var ansible_connection to ssh 15406 1726854946.20132: Set connection var ansible_shell_type to sh 15406 1726854946.20137: Set connection var ansible_shell_executable to /bin/sh 15406 1726854946.20144: Set connection var ansible_pipelining to False 15406 1726854946.20285: variable 'ansible_shell_executable' from source: unknown 15406 1726854946.20290: variable 'ansible_connection' from source: unknown 15406 1726854946.20320: variable 'ansible_module_compression' from source: unknown 15406 1726854946.20323: variable 'ansible_shell_type' from source: unknown 15406 1726854946.20326: variable 'ansible_shell_executable' from source: unknown 15406 1726854946.20328: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854946.20330: variable 'ansible_pipelining' from source: unknown 15406 1726854946.20332: variable 'ansible_timeout' from source: unknown 15406 1726854946.20334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854946.20723: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854946.20733: variable 'omit' from source: magic vars 15406 1726854946.20737: starting attempt loop 15406 1726854946.20740: running the handler 15406 1726854946.20755: _low_level_execute_command(): starting 15406 1726854946.20758: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854946.22112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 
1726854946.22117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854946.22277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854946.22281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854946.22422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854946.22579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854946.24232: stdout chunk (state=3): >>>/root <<< 15406 1726854946.24393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854946.24396: stderr chunk (state=3): >>><<< 15406 1726854946.24399: stdout chunk (state=3): >>><<< 15406 1726854946.24466: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854946.24480: _low_level_execute_command(): starting 15406 1726854946.24486: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854946.2446632-16117-159364333183658 `" && echo ansible-tmp-1726854946.2446632-16117-159364333183658="` echo /root/.ansible/tmp/ansible-tmp-1726854946.2446632-16117-159364333183658 `" ) && sleep 0' 15406 1726854946.25362: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854946.25377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854946.25396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854946.25412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854946.25433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854946.25444: stderr chunk (state=3): >>>debug2: match not found <<< 15406 1726854946.25503: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854946.25607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854946.25626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854946.25984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854946.26085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854946.28233: stdout chunk (state=3): >>>ansible-tmp-1726854946.2446632-16117-159364333183658=/root/.ansible/tmp/ansible-tmp-1726854946.2446632-16117-159364333183658 <<< 15406 1726854946.28340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854946.28344: stdout chunk (state=3): >>><<< 15406 1726854946.28346: stderr chunk (state=3): >>><<< 15406 1726854946.28348: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854946.2446632-16117-159364333183658=/root/.ansible/tmp/ansible-tmp-1726854946.2446632-16117-159364333183658 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854946.28351: variable 'ansible_module_compression' from source: unknown 15406 1726854946.28595: ANSIBALLZ: Using lock for ping 15406 1726854946.28601: ANSIBALLZ: Acquiring lock 15406 1726854946.28609: ANSIBALLZ: Lock acquired: 140626829998656 15406 1726854946.28617: ANSIBALLZ: Creating module 15406 1726854946.58015: ANSIBALLZ: Writing module into payload 15406 1726854946.58080: ANSIBALLZ: Writing module 15406 1726854946.58224: ANSIBALLZ: Renaming module 15406 1726854946.58237: ANSIBALLZ: Done creating module 15406 1726854946.58258: variable 'ansible_facts' from source: unknown 15406 1726854946.58362: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854946.2446632-16117-159364333183658/AnsiballZ_ping.py 15406 1726854946.59317: Sending initial data 15406 1726854946.59320: Sent initial data (153 bytes) 15406 1726854946.60373: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854946.60555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 
10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854946.60600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854946.60663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854946.60713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854946.60985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854946.62664: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854946.62733: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854946.62916: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpks567zh2 /root/.ansible/tmp/ansible-tmp-1726854946.2446632-16117-159364333183658/AnsiballZ_ping.py <<< 15406 1726854946.62919: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854946.2446632-16117-159364333183658/AnsiballZ_ping.py" <<< 15406 1726854946.62948: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpks567zh2" to remote "/root/.ansible/tmp/ansible-tmp-1726854946.2446632-16117-159364333183658/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854946.2446632-16117-159364333183658/AnsiballZ_ping.py" <<< 15406 1726854946.65001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854946.65005: stdout chunk (state=3): >>><<< 15406 1726854946.65008: stderr chunk (state=3): >>><<< 15406 1726854946.65010: done transferring module to remote 15406 1726854946.65012: _low_level_execute_command(): starting 15406 1726854946.65015: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854946.2446632-16117-159364333183658/ /root/.ansible/tmp/ansible-tmp-1726854946.2446632-16117-159364333183658/AnsiballZ_ping.py && sleep 0' 15406 1726854946.66148: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854946.66161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854946.66175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854946.66195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854946.66304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854946.66382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854946.66462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854946.66519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854946.66621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854946.68417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854946.68514: stderr chunk (state=3): >>><<< 15406 1726854946.68620: stdout chunk (state=3): >>><<< 15406 1726854946.68625: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854946.68628: _low_level_execute_command(): starting 15406 1726854946.68630: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854946.2446632-16117-159364333183658/AnsiballZ_ping.py && sleep 0' 15406 1726854946.69870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854946.69886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854946.69906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854946.69924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854946.70082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854946.70171: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854946.70202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854946.70264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854946.70497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854946.85222: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15406 1726854946.86699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854946.86704: stdout chunk (state=3): >>><<< 15406 1726854946.86707: stderr chunk (state=3): >>><<< 15406 1726854946.86710: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854946.86713: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854946.2446632-16117-159364333183658/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854946.86715: _low_level_execute_command(): starting 15406 1726854946.86718: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854946.2446632-16117-159364333183658/ > /dev/null 2>&1 && sleep 0' 15406 1726854946.87709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854946.87804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854946.87827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854946.87838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854946.87940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854946.89967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854946.89970: stdout chunk (state=3): >>><<< 15406 1726854946.89977: stderr chunk (state=3): >>><<< 15406 1726854946.90000: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 
1726854946.90007: handler run complete 15406 1726854946.90050: attempt loop complete, returning result 15406 1726854946.90053: _execute() done 15406 1726854946.90055: dumping result to json 15406 1726854946.90057: done dumping result, returning 15406 1726854946.90059: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-3c83-32d3-000000000029] 15406 1726854946.90061: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000029 ok: [managed_node2] => { "changed": false, "ping": "pong" } 15406 1726854946.90202: no more pending results, returning what we have 15406 1726854946.90206: results queue empty 15406 1726854946.90207: checking for any_errors_fatal 15406 1726854946.90214: done checking for any_errors_fatal 15406 1726854946.90214: checking for max_fail_percentage 15406 1726854946.90216: done checking for max_fail_percentage 15406 1726854946.90217: checking to see if all hosts have failed and the running result is not ok 15406 1726854946.90218: done checking to see if all hosts have failed 15406 1726854946.90218: getting the remaining hosts for this loop 15406 1726854946.90220: done getting the remaining hosts for this loop 15406 1726854946.90223: getting the next task for host managed_node2 15406 1726854946.90230: done getting next task for host managed_node2 15406 1726854946.90232: ^ task is: TASK: meta (role_complete) 15406 1726854946.90234: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854946.90242: getting variables 15406 1726854946.90244: in VariableManager get_vars() 15406 1726854946.90281: Calling all_inventory to load vars for managed_node2 15406 1726854946.90283: Calling groups_inventory to load vars for managed_node2 15406 1726854946.90285: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854946.90595: Calling all_plugins_play to load vars for managed_node2 15406 1726854946.90599: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854946.90603: Calling groups_plugins_play to load vars for managed_node2 15406 1726854946.91721: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000029 15406 1726854946.91725: WORKER PROCESS EXITING 15406 1726854946.93417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854946.96636: done with get_vars() 15406 1726854946.96666: done getting variables 15406 1726854946.96958: done queuing things up, now waiting for results queue to drain 15406 1726854946.96961: results queue empty 15406 1726854946.96961: checking for any_errors_fatal 15406 1726854946.96964: done checking for any_errors_fatal 15406 1726854946.96965: checking for max_fail_percentage 15406 1726854946.96966: done checking for max_fail_percentage 15406 1726854946.96967: checking to see if all hosts have failed and the running result is not ok 15406 1726854946.96968: done checking to see if all hosts have failed 15406 1726854946.96968: getting the remaining hosts for this loop 15406 1726854946.96969: done getting the remaining hosts for this loop 15406 1726854946.96972: getting the next task for host managed_node2 15406 1726854946.96977: done getting next task for host managed_node2 15406 1726854946.96978: ^ task is: TASK: meta (flush_handlers) 15406 1726854946.96979: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854946.96982: getting variables 15406 1726854946.96983: in VariableManager get_vars() 15406 1726854946.97002: Calling all_inventory to load vars for managed_node2 15406 1726854946.97004: Calling groups_inventory to load vars for managed_node2 15406 1726854946.97006: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854946.97012: Calling all_plugins_play to load vars for managed_node2 15406 1726854946.97014: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854946.97017: Calling groups_plugins_play to load vars for managed_node2 15406 1726854946.99095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854947.00833: done with get_vars() 15406 1726854947.00859: done getting variables 15406 1726854947.00917: in VariableManager get_vars() 15406 1726854947.00929: Calling all_inventory to load vars for managed_node2 15406 1726854947.00932: Calling groups_inventory to load vars for managed_node2 15406 1726854947.00934: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854947.00939: Calling all_plugins_play to load vars for managed_node2 15406 1726854947.00941: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854947.00944: Calling groups_plugins_play to load vars for managed_node2 15406 1726854947.02156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854947.03748: done with get_vars() 15406 1726854947.03786: done queuing things up, now waiting for results queue to drain 15406 1726854947.03792: results queue empty 15406 1726854947.03793: checking for any_errors_fatal 15406 1726854947.03795: done checking for any_errors_fatal 15406 1726854947.03795: checking for 
max_fail_percentage 15406 1726854947.03796: done checking for max_fail_percentage 15406 1726854947.03797: checking to see if all hosts have failed and the running result is not ok 15406 1726854947.03798: done checking to see if all hosts have failed 15406 1726854947.03799: getting the remaining hosts for this loop 15406 1726854947.03800: done getting the remaining hosts for this loop 15406 1726854947.03803: getting the next task for host managed_node2 15406 1726854947.03807: done getting next task for host managed_node2 15406 1726854947.03809: ^ task is: TASK: meta (flush_handlers) 15406 1726854947.03811: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854947.03814: getting variables 15406 1726854947.03815: in VariableManager get_vars() 15406 1726854947.03828: Calling all_inventory to load vars for managed_node2 15406 1726854947.03830: Calling groups_inventory to load vars for managed_node2 15406 1726854947.03836: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854947.03842: Calling all_plugins_play to load vars for managed_node2 15406 1726854947.03844: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854947.03847: Calling groups_plugins_play to load vars for managed_node2 15406 1726854947.05172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854947.08504: done with get_vars() 15406 1726854947.08532: done getting variables 15406 1726854947.08716: in VariableManager get_vars() 15406 1726854947.08730: Calling all_inventory to load vars for managed_node2 15406 1726854947.08733: Calling groups_inventory to load vars for managed_node2 15406 1726854947.08735: Calling all_plugins_inventory to load vars 
for managed_node2 15406 1726854947.08740: Calling all_plugins_play to load vars for managed_node2 15406 1726854947.08742: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854947.08744: Calling groups_plugins_play to load vars for managed_node2 15406 1726854947.11374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854947.14771: done with get_vars() 15406 1726854947.14808: done queuing things up, now waiting for results queue to drain 15406 1726854947.14810: results queue empty 15406 1726854947.14811: checking for any_errors_fatal 15406 1726854947.14812: done checking for any_errors_fatal 15406 1726854947.14813: checking for max_fail_percentage 15406 1726854947.14814: done checking for max_fail_percentage 15406 1726854947.14815: checking to see if all hosts have failed and the running result is not ok 15406 1726854947.14815: done checking to see if all hosts have failed 15406 1726854947.14816: getting the remaining hosts for this loop 15406 1726854947.14817: done getting the remaining hosts for this loop 15406 1726854947.14819: getting the next task for host managed_node2 15406 1726854947.14823: done getting next task for host managed_node2 15406 1726854947.14823: ^ task is: None 15406 1726854947.14825: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854947.14826: done queuing things up, now waiting for results queue to drain 15406 1726854947.14826: results queue empty 15406 1726854947.14827: checking for any_errors_fatal 15406 1726854947.14828: done checking for any_errors_fatal 15406 1726854947.14828: checking for max_fail_percentage 15406 1726854947.14829: done checking for max_fail_percentage 15406 1726854947.14830: checking to see if all hosts have failed and the running result is not ok 15406 1726854947.14830: done checking to see if all hosts have failed 15406 1726854947.14831: getting the next task for host managed_node2 15406 1726854947.14833: done getting next task for host managed_node2 15406 1726854947.14834: ^ task is: None 15406 1726854947.14835: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854947.14998: in VariableManager get_vars() 15406 1726854947.15017: done with get_vars() 15406 1726854947.15023: in VariableManager get_vars() 15406 1726854947.15031: done with get_vars() 15406 1726854947.15037: variable 'omit' from source: magic vars 15406 1726854947.15273: variable 'task' from source: play vars 15406 1726854947.15425: in VariableManager get_vars() 15406 1726854947.15436: done with get_vars() 15406 1726854947.15454: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_present.yml] ************************ 15406 1726854947.15993: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15406 1726854947.16101: getting the remaining hosts for this loop 15406 1726854947.16103: done getting the remaining hosts for this loop 15406 1726854947.16105: getting the next task for host managed_node2 15406 1726854947.16108: done getting next task for host managed_node2 15406 1726854947.16110: ^ task is: TASK: Gathering Facts 15406 1726854947.16111: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854947.16113: getting variables 15406 1726854947.16114: in VariableManager get_vars() 15406 1726854947.16123: Calling all_inventory to load vars for managed_node2 15406 1726854947.16125: Calling groups_inventory to load vars for managed_node2 15406 1726854947.16127: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854947.16133: Calling all_plugins_play to load vars for managed_node2 15406 1726854947.16135: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854947.16138: Calling groups_plugins_play to load vars for managed_node2 15406 1726854947.18778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854947.22645: done with get_vars() 15406 1726854947.22675: done getting variables 15406 1726854947.22730: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 13:55:47 -0400 (0:00:01.050) 0:00:15.050 ****** 15406 1726854947.22758: entering _queue_task() for managed_node2/gather_facts 15406 1726854947.23414: worker is 1 (out of 1 available) 15406 1726854947.23425: exiting _queue_task() for managed_node2/gather_facts 15406 1726854947.23438: done queuing things up, now waiting for results queue to drain 15406 1726854947.23439: waiting for pending results... 
15406 1726854947.24004: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15406 1726854947.24245: in run() - task 0affcc66-ac2b-3c83-32d3-000000000219 15406 1726854947.24352: variable 'ansible_search_path' from source: unknown 15406 1726854947.24356: calling self._execute() 15406 1726854947.24402: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854947.24469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854947.24485: variable 'omit' from source: magic vars 15406 1726854947.25179: variable 'ansible_distribution_major_version' from source: facts 15406 1726854947.25492: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854947.25496: variable 'omit' from source: magic vars 15406 1726854947.25498: variable 'omit' from source: magic vars 15406 1726854947.25501: variable 'omit' from source: magic vars 15406 1726854947.25504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854947.25602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854947.25837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854947.25841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854947.25844: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854947.25846: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854947.25849: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854947.25851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854947.26031: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 
1726854947.26063: Set connection var ansible_timeout to 10 15406 1726854947.26168: Set connection var ansible_connection to ssh 15406 1726854947.26180: Set connection var ansible_shell_type to sh 15406 1726854947.26192: Set connection var ansible_shell_executable to /bin/sh 15406 1726854947.26205: Set connection var ansible_pipelining to False 15406 1726854947.26235: variable 'ansible_shell_executable' from source: unknown 15406 1726854947.26243: variable 'ansible_connection' from source: unknown 15406 1726854947.26250: variable 'ansible_module_compression' from source: unknown 15406 1726854947.26258: variable 'ansible_shell_type' from source: unknown 15406 1726854947.26267: variable 'ansible_shell_executable' from source: unknown 15406 1726854947.26278: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854947.26289: variable 'ansible_pipelining' from source: unknown 15406 1726854947.26298: variable 'ansible_timeout' from source: unknown 15406 1726854947.26494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854947.26691: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854947.26896: variable 'omit' from source: magic vars 15406 1726854947.26900: starting attempt loop 15406 1726854947.26902: running the handler 15406 1726854947.26904: variable 'ansible_facts' from source: unknown 15406 1726854947.26906: _low_level_execute_command(): starting 15406 1726854947.26907: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854947.28306: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854947.28422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854947.28450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854947.28574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854947.30312: stdout chunk (state=3): >>>/root <<< 15406 1726854947.30430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854947.30530: stderr chunk (state=3): >>><<< 15406 1726854947.30533: stdout chunk (state=3): >>><<< 15406 1726854947.30621: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854947.30626: _low_level_execute_command(): starting 15406 1726854947.30629: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854947.3055246-16158-139256807768282 `" && echo ansible-tmp-1726854947.3055246-16158-139256807768282="` echo /root/.ansible/tmp/ansible-tmp-1726854947.3055246-16158-139256807768282 `" ) && sleep 0' 15406 1726854947.31923: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854947.31937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15406 1726854947.31965: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854947.32033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854947.32104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854947.34019: stdout chunk (state=3): >>>ansible-tmp-1726854947.3055246-16158-139256807768282=/root/.ansible/tmp/ansible-tmp-1726854947.3055246-16158-139256807768282 <<< 15406 1726854947.34123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854947.34156: stderr chunk (state=3): >>><<< 15406 1726854947.34461: stdout chunk (state=3): >>><<< 15406 1726854947.34466: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854947.3055246-16158-139256807768282=/root/.ansible/tmp/ansible-tmp-1726854947.3055246-16158-139256807768282 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854947.34469: variable 'ansible_module_compression' from source: unknown 15406 1726854947.34471: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15406 1726854947.34560: variable 'ansible_facts' from source: unknown 15406 1726854947.35079: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854947.3055246-16158-139256807768282/AnsiballZ_setup.py 15406 1726854947.35376: Sending initial data 15406 1726854947.35386: Sent initial data (154 bytes) 15406 1726854947.36639: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854947.36652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 15406 1726854947.36667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854947.36910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 15406 1726854947.36914: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854947.36968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854947.36971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854947.36996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854947.37094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854947.38751: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 15406 1726854947.38766: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854947.38821: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854947.38886: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmph3gwgdad /root/.ansible/tmp/ansible-tmp-1726854947.3055246-16158-139256807768282/AnsiballZ_setup.py <<< 15406 1726854947.38912: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854947.3055246-16158-139256807768282/AnsiballZ_setup.py" <<< 15406 1726854947.38990: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmph3gwgdad" to remote "/root/.ansible/tmp/ansible-tmp-1726854947.3055246-16158-139256807768282/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854947.3055246-16158-139256807768282/AnsiballZ_setup.py" <<< 15406 1726854947.41745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854947.41801: stderr chunk (state=3): >>><<< 15406 1726854947.41994: stdout chunk (state=3): >>><<< 15406 1726854947.41997: done transferring module to remote 15406 1726854947.42000: _low_level_execute_command(): starting 15406 1726854947.42003: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854947.3055246-16158-139256807768282/ /root/.ansible/tmp/ansible-tmp-1726854947.3055246-16158-139256807768282/AnsiballZ_setup.py && sleep 0' 15406 1726854947.43760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854947.43775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854947.43788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854947.43858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854947.44052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854947.45825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854947.46053: stderr chunk (state=3): >>><<< 15406 1726854947.46057: stdout chunk (state=3): >>><<< 15406 1726854947.46060: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854947.46062: _low_level_execute_command(): starting 15406 1726854947.46065: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854947.3055246-16158-139256807768282/AnsiballZ_setup.py && sleep 0' 15406 1726854947.47224: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854947.47269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854947.47287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854947.47314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854947.47366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854947.47434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854947.47453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854947.47473: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854947.47697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854948.12834: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, 
"version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtot<<< 15406 1726854948.12903: stdout chunk (state=3): >>>al_mb": 3531, "ansible_memfree_mb": 2968, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 563, "free": 2968}, "nocache": {"free": 3305, "used": 226}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 731, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797134336, "block_size": 4096, "block_total": 65519099, "block_available": 63915316, "block_used": 1603783, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_loadavg": {"1m": 0.42333984375, "5m": 0.36083984375, "15m": 0.1796875}, "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": 
"10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "46:8a:ac:06:3d:16", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_g<<< 15406 1726854948.12912: stdout chunk (state=3): >>>eneric": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": 
"on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "55", "second": "48", "epoch": "1726854948", "epoch_int": "1726854948", "date": "2024-09-20", "time": "13:55:48", "iso8601_micro": "2024-09-20T17:55:48.124106Z", "iso8601": "2024-09-20T17:55:48Z", "iso8601_basic": "20240920T135548124106", "iso8601_basic_short": "20240920T135548", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15406 1726854948.14866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 15406 1726854948.14870: stdout chunk (state=3): >>><<< 15406 1726854948.14872: stderr chunk (state=3): >>><<< 15406 1726854948.14916: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, 
"version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2968, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 563, "free": 2968}, "nocache": {"free": 3305, "used": 226}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": 
{"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 731, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797134336, "block_size": 4096, "block_total": 65519099, "block_available": 63915316, "block_used": 1603783, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_loadavg": {"1m": 0.42333984375, "5m": 0.36083984375, "15m": 0.1796875}, "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", 
"large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", 
"prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "46:8a:ac:06:3d:16", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", 
"net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "55", "second": "48", "epoch": "1726854948", "epoch_int": "1726854948", "date": "2024-09-20", "time": "13:55:48", "iso8601_micro": "2024-09-20T17:55:48.124106Z", "iso8601": "2024-09-20T17:55:48Z", "iso8601_basic": "20240920T135548124106", "iso8601_basic_short": "20240920T135548", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854948.15397: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854947.3055246-16158-139256807768282/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854948.15401: _low_level_execute_command(): starting 15406 1726854948.15405: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854947.3055246-16158-139256807768282/ > /dev/null 2>&1 && sleep 0' 15406 1726854948.15999: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854948.16015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854948.16029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854948.16050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854948.16077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854948.16094: stderr chunk (state=3): >>>debug2: match not found <<< 15406 1726854948.16187: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854948.16224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854948.16241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854948.16364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854948.18233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854948.18289: stderr chunk (state=3): >>><<< 15406 1726854948.18326: stdout chunk (state=3): >>><<< 15406 1726854948.18397: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854948.18400: handler run complete 15406 1726854948.18565: variable 'ansible_facts' from source: unknown 15406 1726854948.18784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854948.19545: variable 'ansible_facts' from source: unknown 15406 1726854948.19837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854948.20213: attempt loop complete, returning result 15406 1726854948.20243: _execute() done 15406 1726854948.20274: dumping result to json 15406 1726854948.20373: done dumping result, returning 15406 1726854948.20409: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-3c83-32d3-000000000219] 15406 1726854948.20451: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000219 ok: [managed_node2] 15406 1726854948.21909: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000219 15406 1726854948.21914: WORKER PROCESS EXITING 15406 1726854948.22033: no more pending results, returning what we have 15406 1726854948.22036: results queue empty 15406 1726854948.22039: checking for any_errors_fatal 15406 1726854948.22040: done checking for any_errors_fatal 15406 1726854948.22041: checking for max_fail_percentage 15406 1726854948.22086: done checking for max_fail_percentage 15406 1726854948.22090: checking to see if all hosts have failed and the running result is not ok 15406 1726854948.22091: done checking to see if all hosts have failed 15406 1726854948.22115: 
getting the remaining hosts for this loop 15406 1726854948.22116: done getting the remaining hosts for this loop 15406 1726854948.22120: getting the next task for host managed_node2 15406 1726854948.22126: done getting next task for host managed_node2 15406 1726854948.22128: ^ task is: TASK: meta (flush_handlers) 15406 1726854948.22130: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854948.22164: getting variables 15406 1726854948.22166: in VariableManager get_vars() 15406 1726854948.22192: Calling all_inventory to load vars for managed_node2 15406 1726854948.22195: Calling groups_inventory to load vars for managed_node2 15406 1726854948.22198: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854948.22209: Calling all_plugins_play to load vars for managed_node2 15406 1726854948.22212: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854948.22215: Calling groups_plugins_play to load vars for managed_node2 15406 1726854948.31065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854948.32742: done with get_vars() 15406 1726854948.32769: done getting variables 15406 1726854948.32829: in VariableManager get_vars() 15406 1726854948.32839: Calling all_inventory to load vars for managed_node2 15406 1726854948.32841: Calling groups_inventory to load vars for managed_node2 15406 1726854948.32843: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854948.32848: Calling all_plugins_play to load vars for managed_node2 15406 1726854948.32850: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854948.32853: Calling groups_plugins_play to load vars for managed_node2 
15406 1726854948.34078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854948.36877: done with get_vars() 15406 1726854948.36916: done queuing things up, now waiting for results queue to drain 15406 1726854948.36918: results queue empty 15406 1726854948.36919: checking for any_errors_fatal 15406 1726854948.36928: done checking for any_errors_fatal 15406 1726854948.36929: checking for max_fail_percentage 15406 1726854948.36930: done checking for max_fail_percentage 15406 1726854948.36930: checking to see if all hosts have failed and the running result is not ok 15406 1726854948.36931: done checking to see if all hosts have failed 15406 1726854948.36932: getting the remaining hosts for this loop 15406 1726854948.36933: done getting the remaining hosts for this loop 15406 1726854948.36936: getting the next task for host managed_node2 15406 1726854948.36940: done getting next task for host managed_node2 15406 1726854948.36942: ^ task is: TASK: Include the task '{{ task }}' 15406 1726854948.36944: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854948.36946: getting variables 15406 1726854948.36947: in VariableManager get_vars() 15406 1726854948.36958: Calling all_inventory to load vars for managed_node2 15406 1726854948.36960: Calling groups_inventory to load vars for managed_node2 15406 1726854948.36963: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854948.36969: Calling all_plugins_play to load vars for managed_node2 15406 1726854948.36971: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854948.36974: Calling groups_plugins_play to load vars for managed_node2 15406 1726854948.39358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854948.42099: done with get_vars() 15406 1726854948.42122: done getting variables 15406 1726854948.42266: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_present.yml'] ********************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 13:55:48 -0400 (0:00:01.195) 0:00:16.249 ****** 15406 1726854948.42699: entering _queue_task() for managed_node2/include_tasks 15406 1726854948.43340: worker is 1 (out of 1 available) 15406 1726854948.43353: exiting _queue_task() for managed_node2/include_tasks 15406 1726854948.43363: done queuing things up, now waiting for results queue to drain 15406 1726854948.43364: waiting for pending results... 
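The trace above shows the play-level task `Include the task '{{ task }}'` entering `_queue_task()` as an `include_tasks` action, with the `task` variable resolved from play vars to `tasks/assert_device_present.yml`. A minimal sketch of what that entry in `run_tasks.yml` likely looks like (hypothetical reconstruction from the task name and action type in the log; the real file is at the task path printed above):

```yaml
# Hypothetical sketch of the include seen in the log (run_tasks.yml:6).
# 'task' is supplied as a play variable; in this run it resolves to
# 'tasks/assert_device_present.yml'.
- name: Include the task '{{ task }}'
  include_tasks: "{{ task }}"
```

The log's `Evaluated conditional (ansible_distribution_major_version != '6'): True` lines suggest the tasks also carry a `when:` guard on the distribution version, evaluated before each include runs.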
15406 1726854948.43925: running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_device_present.yml' 15406 1726854948.44041: in run() - task 0affcc66-ac2b-3c83-32d3-00000000002d 15406 1726854948.44061: variable 'ansible_search_path' from source: unknown 15406 1726854948.44349: calling self._execute() 15406 1726854948.44352: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854948.44355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854948.44357: variable 'omit' from source: magic vars 15406 1726854948.45088: variable 'ansible_distribution_major_version' from source: facts 15406 1726854948.45292: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854948.45296: variable 'task' from source: play vars 15406 1726854948.45298: variable 'task' from source: play vars 15406 1726854948.45544: _execute() done 15406 1726854948.45548: dumping result to json 15406 1726854948.45550: done dumping result, returning 15406 1726854948.45553: done running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_device_present.yml' [0affcc66-ac2b-3c83-32d3-00000000002d] 15406 1726854948.45555: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000002d 15406 1726854948.45636: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000002d 15406 1726854948.45640: WORKER PROCESS EXITING 15406 1726854948.45669: no more pending results, returning what we have 15406 1726854948.45675: in VariableManager get_vars() 15406 1726854948.45712: Calling all_inventory to load vars for managed_node2 15406 1726854948.45715: Calling groups_inventory to load vars for managed_node2 15406 1726854948.45719: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854948.45733: Calling all_plugins_play to load vars for managed_node2 15406 1726854948.45735: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854948.45738: Calling 
groups_plugins_play to load vars for managed_node2 15406 1726854948.47693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854948.50215: done with get_vars() 15406 1726854948.50239: variable 'ansible_search_path' from source: unknown 15406 1726854948.50255: we have included files to process 15406 1726854948.50257: generating all_blocks data 15406 1726854948.50258: done generating all_blocks data 15406 1726854948.50259: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15406 1726854948.50260: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15406 1726854948.50263: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15406 1726854948.50443: in VariableManager get_vars() 15406 1726854948.50460: done with get_vars() 15406 1726854948.50580: done processing included file 15406 1726854948.50582: iterating over new_blocks loaded from include file 15406 1726854948.50584: in VariableManager get_vars() 15406 1726854948.50601: done with get_vars() 15406 1726854948.50603: filtering new block on tags 15406 1726854948.50622: done filtering new block on tags 15406 1726854948.50624: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 15406 1726854948.50630: extending task lists for all hosts with included blocks 15406 1726854948.50666: done extending task lists 15406 1726854948.50667: done processing included files 15406 1726854948.50668: results queue empty 15406 1726854948.50669: checking for any_errors_fatal 15406 1726854948.50670: done checking for any_errors_fatal 15406 
1726854948.50671: checking for max_fail_percentage 15406 1726854948.50672: done checking for max_fail_percentage 15406 1726854948.50673: checking to see if all hosts have failed and the running result is not ok 15406 1726854948.50674: done checking to see if all hosts have failed 15406 1726854948.50674: getting the remaining hosts for this loop 15406 1726854948.50675: done getting the remaining hosts for this loop 15406 1726854948.50679: getting the next task for host managed_node2 15406 1726854948.50682: done getting next task for host managed_node2 15406 1726854948.50685: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15406 1726854948.50691: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854948.50694: getting variables 15406 1726854948.50695: in VariableManager get_vars() 15406 1726854948.50703: Calling all_inventory to load vars for managed_node2 15406 1726854948.50705: Calling groups_inventory to load vars for managed_node2 15406 1726854948.50707: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854948.50712: Calling all_plugins_play to load vars for managed_node2 15406 1726854948.50714: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854948.50717: Calling groups_plugins_play to load vars for managed_node2 15406 1726854948.52319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854948.54978: done with get_vars() 15406 1726854948.55010: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:55:48 -0400 (0:00:00.123) 0:00:16.373 ****** 15406 1726854948.55084: entering _queue_task() for managed_node2/include_tasks 15406 1726854948.55440: worker is 1 (out of 1 available) 15406 1726854948.55452: exiting _queue_task() for managed_node2/include_tasks 15406 1726854948.55464: done queuing things up, now waiting for results queue to drain 15406 1726854948.55466: waiting for pending results... 
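At this point `assert_device_present.yml` is itself including `get_interface_stat.yml` (its line 3, per the task path above). A plausible sketch of the including file, assuming the conventional include-then-assert pattern; the assert step and the `interface_stat` register name are assumptions, as the log ends before that task runs:

```yaml
# Hypothetical sketch of assert_device_present.yml, inferred from the
# task names in the log. The assert task and the 'interface_stat'
# register name are assumptions, not shown in this log excerpt.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is present
  assert:
    that:
      - interface_stat.stat.exists
```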
15406 1726854948.56074: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 15406 1726854948.56512: in run() - task 0affcc66-ac2b-3c83-32d3-00000000022a 15406 1726854948.56515: variable 'ansible_search_path' from source: unknown 15406 1726854948.56518: variable 'ansible_search_path' from source: unknown 15406 1726854948.56544: calling self._execute() 15406 1726854948.56754: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854948.56799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854948.56826: variable 'omit' from source: magic vars 15406 1726854948.57219: variable 'ansible_distribution_major_version' from source: facts 15406 1726854948.57223: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854948.57229: _execute() done 15406 1726854948.57236: dumping result to json 15406 1726854948.57328: done dumping result, returning 15406 1726854948.57332: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-3c83-32d3-00000000022a] 15406 1726854948.57335: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000022a 15406 1726854948.57410: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000022a 15406 1726854948.57413: WORKER PROCESS EXITING 15406 1726854948.57456: no more pending results, returning what we have 15406 1726854948.57462: in VariableManager get_vars() 15406 1726854948.57500: Calling all_inventory to load vars for managed_node2 15406 1726854948.57502: Calling groups_inventory to load vars for managed_node2 15406 1726854948.57506: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854948.57519: Calling all_plugins_play to load vars for managed_node2 15406 1726854948.57522: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854948.57524: Calling groups_plugins_play to load vars for managed_node2 15406 
1726854948.59156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854948.61197: done with get_vars() 15406 1726854948.61215: variable 'ansible_search_path' from source: unknown 15406 1726854948.61217: variable 'ansible_search_path' from source: unknown 15406 1726854948.61227: variable 'task' from source: play vars 15406 1726854948.61344: variable 'task' from source: play vars 15406 1726854948.61379: we have included files to process 15406 1726854948.61380: generating all_blocks data 15406 1726854948.61381: done generating all_blocks data 15406 1726854948.61383: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15406 1726854948.61384: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15406 1726854948.61386: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15406 1726854948.61614: done processing included file 15406 1726854948.61616: iterating over new_blocks loaded from include file 15406 1726854948.61617: in VariableManager get_vars() 15406 1726854948.61706: done with get_vars() 15406 1726854948.61708: filtering new block on tags 15406 1726854948.61725: done filtering new block on tags 15406 1726854948.61728: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 15406 1726854948.61733: extending task lists for all hosts with included blocks 15406 1726854948.61857: done extending task lists 15406 1726854948.61858: done processing included files 15406 1726854948.61859: results queue empty 15406 1726854948.61860: checking for any_errors_fatal 15406 1726854948.61866: done checking 
for any_errors_fatal 15406 1726854948.61867: checking for max_fail_percentage 15406 1726854948.61868: done checking for max_fail_percentage 15406 1726854948.61869: checking to see if all hosts have failed and the running result is not ok 15406 1726854948.61870: done checking to see if all hosts have failed 15406 1726854948.61871: getting the remaining hosts for this loop 15406 1726854948.61872: done getting the remaining hosts for this loop 15406 1726854948.61874: getting the next task for host managed_node2 15406 1726854948.61881: done getting next task for host managed_node2 15406 1726854948.61890: ^ task is: TASK: Get stat for interface {{ interface }} 15406 1726854948.61893: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854948.61895: getting variables 15406 1726854948.61896: in VariableManager get_vars() 15406 1726854948.61905: Calling all_inventory to load vars for managed_node2 15406 1726854948.61907: Calling groups_inventory to load vars for managed_node2 15406 1726854948.61910: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854948.61915: Calling all_plugins_play to load vars for managed_node2 15406 1726854948.61917: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854948.61920: Calling groups_plugins_play to load vars for managed_node2 15406 1726854948.63110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854948.64836: done with get_vars() 15406 1726854948.64865: done getting variables 15406 1726854948.65048: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:55:48 -0400 (0:00:00.099) 0:00:16.473 ****** 15406 1726854948.65080: entering _queue_task() for managed_node2/stat 15406 1726854948.65496: worker is 1 (out of 1 available) 15406 1726854948.65511: exiting _queue_task() for managed_node2/stat 15406 1726854948.65528: done queuing things up, now waiting for results queue to drain 15406 1726854948.65530: waiting for pending results... 
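The next task, `Get stat for interface LSR-TST-br31`, enters `_queue_task()` for the `stat` module (the log shows `managed_node2/stat`), with `interface` coming from `set_fact`. A minimal sketch of `get_interface_stat.yml` consistent with that; the `/sys/class/net` path and the `interface_stat` register name are assumptions, since only the module name and templated task name appear in the log:

```yaml
# Hypothetical sketch of get_interface_stat.yml. The stat module and
# the templated task name match the log; the sysfs path and register
# name are assumed.
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}
  register: interface_stat
```

Stat-ing the device node under `/sys/class/net` is a common way to check device presence without shelling out to `ip link`.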
15406 1726854948.65862: running TaskExecutor() for managed_node2/TASK: Get stat for interface LSR-TST-br31 15406 1726854948.66077: in run() - task 0affcc66-ac2b-3c83-32d3-000000000235 15406 1726854948.66081: variable 'ansible_search_path' from source: unknown 15406 1726854948.66084: variable 'ansible_search_path' from source: unknown 15406 1726854948.66086: calling self._execute() 15406 1726854948.66129: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854948.66132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854948.66140: variable 'omit' from source: magic vars 15406 1726854948.66592: variable 'ansible_distribution_major_version' from source: facts 15406 1726854948.66596: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854948.66599: variable 'omit' from source: magic vars 15406 1726854948.66645: variable 'omit' from source: magic vars 15406 1726854948.66742: variable 'interface' from source: set_fact 15406 1726854948.66761: variable 'omit' from source: magic vars 15406 1726854948.66816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854948.66951: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854948.66955: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854948.66957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854948.66960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854948.66962: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854948.66964: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854948.66966: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854948.67059: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854948.67072: Set connection var ansible_timeout to 10 15406 1726854948.67075: Set connection var ansible_connection to ssh 15406 1726854948.67077: Set connection var ansible_shell_type to sh 15406 1726854948.67092: Set connection var ansible_shell_executable to /bin/sh 15406 1726854948.67095: Set connection var ansible_pipelining to False 15406 1726854948.67121: variable 'ansible_shell_executable' from source: unknown 15406 1726854948.67125: variable 'ansible_connection' from source: unknown 15406 1726854948.67127: variable 'ansible_module_compression' from source: unknown 15406 1726854948.67129: variable 'ansible_shell_type' from source: unknown 15406 1726854948.67132: variable 'ansible_shell_executable' from source: unknown 15406 1726854948.67134: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854948.67136: variable 'ansible_pipelining' from source: unknown 15406 1726854948.67139: variable 'ansible_timeout' from source: unknown 15406 1726854948.67144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854948.67386: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854948.67392: variable 'omit' from source: magic vars 15406 1726854948.67395: starting attempt loop 15406 1726854948.67397: running the handler 15406 1726854948.67400: _low_level_execute_command(): starting 15406 1726854948.67402: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854948.68159: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854948.68196: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854948.68342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854948.68346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854948.68348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854948.68422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854948.70231: stdout chunk (state=3): >>>/root <<< 15406 1726854948.70354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854948.70358: stdout chunk (state=3): >>><<< 15406 1726854948.70360: stderr chunk (state=3): >>><<< 15406 1726854948.70383: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854948.70413: _low_level_execute_command(): starting 15406 1726854948.70494: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854948.703954-16209-62408862508432 `" && echo ansible-tmp-1726854948.703954-16209-62408862508432="` echo /root/.ansible/tmp/ansible-tmp-1726854948.703954-16209-62408862508432 `" ) && sleep 0' 15406 1726854948.71165: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854948.71181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854948.71200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854948.71225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854948.71247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854948.71274: stderr chunk (state=3): >>>debug2: match not found <<< 15406 1726854948.71381: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854948.71400: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854948.71413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854948.71528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854948.73457: stdout chunk (state=3): >>>ansible-tmp-1726854948.703954-16209-62408862508432=/root/.ansible/tmp/ansible-tmp-1726854948.703954-16209-62408862508432 <<< 15406 1726854948.73615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854948.73618: stdout chunk (state=3): >>><<< 15406 1726854948.73620: stderr chunk (state=3): >>><<< 15406 1726854948.73694: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854948.703954-16209-62408862508432=/root/.ansible/tmp/ansible-tmp-1726854948.703954-16209-62408862508432 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854948.73697: variable 'ansible_module_compression' from source: unknown 15406 1726854948.73758: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15406 1726854948.73997: variable 'ansible_facts' from source: unknown 15406 1726854948.74001: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854948.703954-16209-62408862508432/AnsiballZ_stat.py 15406 1726854948.74128: Sending initial data 15406 1726854948.74131: Sent initial data (151 bytes) 15406 1726854948.75004: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854948.75070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854948.75108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854948.75119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854948.75128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854948.75230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854948.76892: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15406 1726854948.76896: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854948.76940: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854948.77051: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp2u_myhfa /root/.ansible/tmp/ansible-tmp-1726854948.703954-16209-62408862508432/AnsiballZ_stat.py <<< 15406 1726854948.77056: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854948.703954-16209-62408862508432/AnsiballZ_stat.py" <<< 15406 1726854948.77146: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp2u_myhfa" to remote "/root/.ansible/tmp/ansible-tmp-1726854948.703954-16209-62408862508432/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854948.703954-16209-62408862508432/AnsiballZ_stat.py" <<< 15406 1726854948.78405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854948.78408: stderr chunk (state=3): >>><<< 15406 1726854948.78411: stdout chunk (state=3): >>><<< 15406 1726854948.78428: done transferring module to remote 15406 1726854948.78515: _low_level_execute_command(): starting 15406 1726854948.78520: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854948.703954-16209-62408862508432/ /root/.ansible/tmp/ansible-tmp-1726854948.703954-16209-62408862508432/AnsiballZ_stat.py && sleep 0' 15406 1726854948.79101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854948.79111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854948.79122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854948.79137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854948.79182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854948.79195: 
stderr chunk (state=3): >>>debug2: match not found <<< 15406 1726854948.79288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854948.79301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854948.79365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854948.81194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854948.81198: stdout chunk (state=3): >>><<< 15406 1726854948.81203: stderr chunk (state=3): >>><<< 15406 1726854948.81219: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854948.81228: _low_level_execute_command(): starting 15406 1726854948.81302: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854948.703954-16209-62408862508432/AnsiballZ_stat.py && sleep 0' 15406 1726854948.81904: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854948.81922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854948.81938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854948.81958: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15406 1726854948.82064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854948.97280: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28963, "dev": 23, "nlink": 1, "atime": 1726854945.7067125, "mtime": 1726854945.7067125, "ctime": 1726854945.7067125, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15406 1726854948.98725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 15406 1726854948.98730: stdout chunk (state=3): >>><<< 15406 1726854948.98732: stderr chunk (state=3): >>><<< 15406 1726854948.98735: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28963, "dev": 23, "nlink": 1, "atime": 1726854945.7067125, "mtime": 1726854945.7067125, "ctime": 1726854945.7067125, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854948.98738: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854948.703954-16209-62408862508432/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854948.98742: _low_level_execute_command(): starting 15406 1726854948.98745: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854948.703954-16209-62408862508432/ > /dev/null 2>&1 && sleep 0' 15406 1726854949.00140: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854949.00147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 
1726854949.00149: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854949.00151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 15406 1726854949.00154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854949.00286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854949.00303: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854949.00392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854949.02273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854949.02276: stdout chunk (state=3): >>><<< 15406 1726854949.02285: stderr chunk (state=3): >>><<< 15406 1726854949.02345: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854949.02351: handler run complete 15406 1726854949.02490: attempt loop complete, returning result 15406 1726854949.02496: _execute() done 15406 1726854949.02499: dumping result to json 15406 1726854949.02505: done dumping result, returning 15406 1726854949.02514: done running TaskExecutor() for managed_node2/TASK: Get stat for interface LSR-TST-br31 [0affcc66-ac2b-3c83-32d3-000000000235] 15406 1726854949.02519: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000235 15406 1726854949.02634: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000235 15406 1726854949.02637: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "stat": {
        "atime": 1726854945.7067125,
        "block_size": 4096,
        "blocks": 0,
        "ctime": 1726854945.7067125,
        "dev": 23,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 28963,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": true,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31",
        "lnk_target": "../../devices/virtual/net/LSR-TST-br31",
        "mode": "0777",
        "mtime": 1726854945.7067125,
        "nlink": 1,
        "path": "/sys/class/net/LSR-TST-br31",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "wgrp": true,
        "woth": true,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
15406 1726854949.02803: no more pending
results, returning what we have 15406 1726854949.02808: results queue empty 15406 1726854949.02809: checking for any_errors_fatal 15406 1726854949.02810: done checking for any_errors_fatal 15406 1726854949.02811: checking for max_fail_percentage 15406 1726854949.02813: done checking for max_fail_percentage 15406 1726854949.02815: checking to see if all hosts have failed and the running result is not ok 15406 1726854949.02816: done checking to see if all hosts have failed 15406 1726854949.02817: getting the remaining hosts for this loop 15406 1726854949.02818: done getting the remaining hosts for this loop 15406 1726854949.02822: getting the next task for host managed_node2 15406 1726854949.02831: done getting next task for host managed_node2 15406 1726854949.02834: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 15406 1726854949.02837: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854949.02842: getting variables 15406 1726854949.02844: in VariableManager get_vars() 15406 1726854949.02872: Calling all_inventory to load vars for managed_node2 15406 1726854949.02875: Calling groups_inventory to load vars for managed_node2 15406 1726854949.02878: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854949.03599: Calling all_plugins_play to load vars for managed_node2 15406 1726854949.03605: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854949.03609: Calling groups_plugins_play to load vars for managed_node2 15406 1726854949.07245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854949.10720: done with get_vars() 15406 1726854949.10753: done getting variables 15406 1726854949.10874: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 15406 1726854949.11018: variable 'interface' from source: set_fact
TASK [Assert that the interface is present - 'LSR-TST-br31'] *******************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
Friday 20 September 2024 13:55:49 -0400 (0:00:00.459) 0:00:16.933 ******
15406 1726854949.11063: entering _queue_task() for managed_node2/assert 15406 1726854949.11508: worker is 1 (out of 1 available) 15406 1726854949.11523: exiting _queue_task() for managed_node2/assert 15406 1726854949.11534: done queuing things up, now waiting for results queue to drain 15406 1726854949.11536: waiting for pending results... 
15406 1726854949.12304: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'LSR-TST-br31' 15406 1726854949.12310: in run() - task 0affcc66-ac2b-3c83-32d3-00000000022b 15406 1726854949.12313: variable 'ansible_search_path' from source: unknown 15406 1726854949.12316: variable 'ansible_search_path' from source: unknown 15406 1726854949.12318: calling self._execute() 15406 1726854949.12320: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854949.12323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854949.12326: variable 'omit' from source: magic vars 15406 1726854949.12692: variable 'ansible_distribution_major_version' from source: facts 15406 1726854949.12696: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854949.12699: variable 'omit' from source: magic vars 15406 1726854949.12701: variable 'omit' from source: magic vars 15406 1726854949.12703: variable 'interface' from source: set_fact 15406 1726854949.12718: variable 'omit' from source: magic vars 15406 1726854949.12759: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854949.12806: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854949.12826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854949.12843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854949.12855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854949.12899: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854949.12902: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854949.12905: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854949.13008: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854949.13016: Set connection var ansible_timeout to 10 15406 1726854949.13018: Set connection var ansible_connection to ssh 15406 1726854949.13024: Set connection var ansible_shell_type to sh 15406 1726854949.13029: Set connection var ansible_shell_executable to /bin/sh 15406 1726854949.13036: Set connection var ansible_pipelining to False 15406 1726854949.13062: variable 'ansible_shell_executable' from source: unknown 15406 1726854949.13065: variable 'ansible_connection' from source: unknown 15406 1726854949.13068: variable 'ansible_module_compression' from source: unknown 15406 1726854949.13070: variable 'ansible_shell_type' from source: unknown 15406 1726854949.13073: variable 'ansible_shell_executable' from source: unknown 15406 1726854949.13075: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854949.13080: variable 'ansible_pipelining' from source: unknown 15406 1726854949.13083: variable 'ansible_timeout' from source: unknown 15406 1726854949.13090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854949.13245: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854949.13254: variable 'omit' from source: magic vars 15406 1726854949.13260: starting attempt loop 15406 1726854949.13263: running the handler 15406 1726854949.13592: variable 'interface_stat' from source: set_fact 15406 1726854949.13596: Evaluated conditional (interface_stat.stat.exists): True 15406 1726854949.13598: handler run complete 15406 1726854949.13599: attempt loop complete, returning result 15406 
1726854949.13601: _execute() done 15406 1726854949.13602: dumping result to json 15406 1726854949.13604: done dumping result, returning 15406 1726854949.13606: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'LSR-TST-br31' [0affcc66-ac2b-3c83-32d3-00000000022b] 15406 1726854949.13608: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000022b 15406 1726854949.13666: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000022b 15406 1726854949.13669: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed
15406 1726854949.13721: no more pending results, returning what we have 15406 1726854949.13725: results queue empty 15406 1726854949.13726: checking for any_errors_fatal 15406 1726854949.13735: done checking for any_errors_fatal 15406 1726854949.13736: checking for max_fail_percentage 15406 1726854949.13738: done checking for max_fail_percentage 15406 1726854949.13739: checking to see if all hosts have failed and the running result is not ok 15406 1726854949.13740: done checking to see if all hosts have failed 15406 1726854949.13741: getting the remaining hosts for this loop 15406 1726854949.13742: done getting the remaining hosts for this loop 15406 1726854949.13746: getting the next task for host managed_node2 15406 1726854949.13762: done getting next task for host managed_node2 15406 1726854949.13765: ^ task is: TASK: meta (flush_handlers) 15406 1726854949.13767: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854949.13771: getting variables 15406 1726854949.13772: in VariableManager get_vars() 15406 1726854949.13904: Calling all_inventory to load vars for managed_node2 15406 1726854949.13907: Calling groups_inventory to load vars for managed_node2 15406 1726854949.13910: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854949.13919: Calling all_plugins_play to load vars for managed_node2 15406 1726854949.13923: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854949.13925: Calling groups_plugins_play to load vars for managed_node2 15406 1726854949.15653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854949.18083: done with get_vars() 15406 1726854949.18111: done getting variables 15406 1726854949.18207: in VariableManager get_vars() 15406 1726854949.18217: Calling all_inventory to load vars for managed_node2 15406 1726854949.18220: Calling groups_inventory to load vars for managed_node2 15406 1726854949.18222: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854949.18227: Calling all_plugins_play to load vars for managed_node2 15406 1726854949.18229: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854949.18237: Calling groups_plugins_play to load vars for managed_node2 15406 1726854949.19965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854949.21889: done with get_vars() 15406 1726854949.21924: done queuing things up, now waiting for results queue to drain 15406 1726854949.21926: results queue empty 15406 1726854949.21927: checking for any_errors_fatal 15406 1726854949.21930: done checking for any_errors_fatal 15406 1726854949.21931: checking for max_fail_percentage 15406 1726854949.21932: done checking for max_fail_percentage 15406 1726854949.21933: checking to see if all hosts have failed and the running result is not 
ok 15406 1726854949.21934: done checking to see if all hosts have failed 15406 1726854949.21939: getting the remaining hosts for this loop 15406 1726854949.21940: done getting the remaining hosts for this loop 15406 1726854949.21943: getting the next task for host managed_node2 15406 1726854949.21947: done getting next task for host managed_node2 15406 1726854949.21948: ^ task is: TASK: meta (flush_handlers) 15406 1726854949.21950: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854949.21952: getting variables 15406 1726854949.21953: in VariableManager get_vars() 15406 1726854949.21961: Calling all_inventory to load vars for managed_node2 15406 1726854949.21963: Calling groups_inventory to load vars for managed_node2 15406 1726854949.21966: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854949.21971: Calling all_plugins_play to load vars for managed_node2 15406 1726854949.21973: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854949.21976: Calling groups_plugins_play to load vars for managed_node2 15406 1726854949.23266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854949.25157: done with get_vars() 15406 1726854949.25179: done getting variables 15406 1726854949.25237: in VariableManager get_vars() 15406 1726854949.25247: Calling all_inventory to load vars for managed_node2 15406 1726854949.25249: Calling groups_inventory to load vars for managed_node2 15406 1726854949.25251: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854949.25256: Calling all_plugins_play to load vars for managed_node2 15406 1726854949.25259: Calling groups_plugins_inventory to load vars for 
managed_node2 15406 1726854949.25261: Calling groups_plugins_play to load vars for managed_node2 15406 1726854949.26724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854949.28339: done with get_vars() 15406 1726854949.28366: done queuing things up, now waiting for results queue to drain 15406 1726854949.28369: results queue empty 15406 1726854949.28369: checking for any_errors_fatal 15406 1726854949.28371: done checking for any_errors_fatal 15406 1726854949.28371: checking for max_fail_percentage 15406 1726854949.28372: done checking for max_fail_percentage 15406 1726854949.28373: checking to see if all hosts have failed and the running result is not ok 15406 1726854949.28374: done checking to see if all hosts have failed 15406 1726854949.28375: getting the remaining hosts for this loop 15406 1726854949.28376: done getting the remaining hosts for this loop 15406 1726854949.28378: getting the next task for host managed_node2 15406 1726854949.28382: done getting next task for host managed_node2 15406 1726854949.28383: ^ task is: None 15406 1726854949.28384: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854949.28385: done queuing things up, now waiting for results queue to drain 15406 1726854949.28386: results queue empty 15406 1726854949.28388: checking for any_errors_fatal 15406 1726854949.28389: done checking for any_errors_fatal 15406 1726854949.28390: checking for max_fail_percentage 15406 1726854949.28391: done checking for max_fail_percentage 15406 1726854949.28391: checking to see if all hosts have failed and the running result is not ok 15406 1726854949.28392: done checking to see if all hosts have failed 15406 1726854949.28393: getting the next task for host managed_node2 15406 1726854949.28395: done getting next task for host managed_node2 15406 1726854949.28396: ^ task is: None 15406 1726854949.28397: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854949.28442: in VariableManager get_vars() 15406 1726854949.28458: done with get_vars() 15406 1726854949.28468: in VariableManager get_vars() 15406 1726854949.28478: done with get_vars() 15406 1726854949.28483: variable 'omit' from source: magic vars 15406 1726854949.28603: variable 'task' from source: play vars 15406 1726854949.28634: in VariableManager get_vars() 15406 1726854949.28645: done with get_vars() 15406 1726854949.28663: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_present.yml] *********************** 15406 1726854949.28911: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15406 1726854949.28937: getting the remaining hosts for this loop 15406 1726854949.28938: done getting the remaining hosts for this loop 15406 1726854949.28941: getting the next task for host managed_node2 15406 1726854949.28944: done getting next task for host managed_node2 15406 1726854949.28946: ^ task is: TASK: Gathering Facts 15406 1726854949.28947: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854949.28949: getting variables 15406 1726854949.28950: in VariableManager get_vars() 15406 1726854949.28958: Calling all_inventory to load vars for managed_node2 15406 1726854949.28960: Calling groups_inventory to load vars for managed_node2 15406 1726854949.28962: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854949.28968: Calling all_plugins_play to load vars for managed_node2 15406 1726854949.28970: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854949.28973: Calling groups_plugins_play to load vars for managed_node2 15406 1726854949.30979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854949.33607: done with get_vars() 15406 1726854949.33630: done getting variables 15406 1726854949.33706: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 13:55:49 -0400 (0:00:00.226) 0:00:17.160 ****** 15406 1726854949.33734: entering _queue_task() for managed_node2/gather_facts 15406 1726854949.34206: worker is 1 (out of 1 available) 15406 1726854949.34226: exiting _queue_task() for managed_node2/gather_facts 15406 1726854949.34236: done queuing things up, now waiting for results queue to drain 15406 1726854949.34238: waiting for pending results... 
15406 1726854949.34604: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15406 1726854949.34609: in run() - task 0affcc66-ac2b-3c83-32d3-00000000024e 15406 1726854949.34612: variable 'ansible_search_path' from source: unknown 15406 1726854949.34627: calling self._execute() 15406 1726854949.34715: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854949.34725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854949.34738: variable 'omit' from source: magic vars 15406 1726854949.35085: variable 'ansible_distribution_major_version' from source: facts 15406 1726854949.35106: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854949.35117: variable 'omit' from source: magic vars 15406 1726854949.35147: variable 'omit' from source: magic vars 15406 1726854949.35193: variable 'omit' from source: magic vars 15406 1726854949.35237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854949.35274: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854949.35303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854949.35329: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854949.35350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854949.35389: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854949.35405: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854949.35414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854949.35517: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 
1726854949.35531: Set connection var ansible_timeout to 10 15406 1726854949.35539: Set connection var ansible_connection to ssh 15406 1726854949.35594: Set connection var ansible_shell_type to sh 15406 1726854949.35597: Set connection var ansible_shell_executable to /bin/sh 15406 1726854949.35600: Set connection var ansible_pipelining to False 15406 1726854949.35602: variable 'ansible_shell_executable' from source: unknown 15406 1726854949.35604: variable 'ansible_connection' from source: unknown 15406 1726854949.35606: variable 'ansible_module_compression' from source: unknown 15406 1726854949.35612: variable 'ansible_shell_type' from source: unknown 15406 1726854949.35618: variable 'ansible_shell_executable' from source: unknown 15406 1726854949.35624: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854949.35631: variable 'ansible_pipelining' from source: unknown 15406 1726854949.35640: variable 'ansible_timeout' from source: unknown 15406 1726854949.35648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854949.35824: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854949.35997: variable 'omit' from source: magic vars 15406 1726854949.36001: starting attempt loop 15406 1726854949.36004: running the handler 15406 1726854949.36007: variable 'ansible_facts' from source: unknown 15406 1726854949.36010: _low_level_execute_command(): starting 15406 1726854949.36012: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854949.36711: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854949.36732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854949.36757: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854949.36855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854949.38595: stdout chunk (state=3): >>>/root <<< 15406 1726854949.38773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854949.38785: stdout chunk (state=3): >>><<< 15406 1726854949.38810: stderr chunk (state=3): >>><<< 15406 1726854949.38855: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854949.38913: _low_level_execute_command(): starting 15406 1726854949.38942: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854949.3889568-16238-262799001056647 `" && echo ansible-tmp-1726854949.3889568-16238-262799001056647="` echo /root/.ansible/tmp/ansible-tmp-1726854949.3889568-16238-262799001056647 `" ) && sleep 0' 15406 1726854949.40002: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854949.40020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854949.40103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854949.40164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854949.40187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854949.40297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854949.42306: stdout chunk (state=3): >>>ansible-tmp-1726854949.3889568-16238-262799001056647=/root/.ansible/tmp/ansible-tmp-1726854949.3889568-16238-262799001056647 <<< 15406 1726854949.42436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854949.42455: stdout chunk (state=3): >>><<< 15406 1726854949.42498: stderr chunk (state=3): >>><<< 15406 1726854949.42694: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854949.3889568-16238-262799001056647=/root/.ansible/tmp/ansible-tmp-1726854949.3889568-16238-262799001056647 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854949.42698: variable 'ansible_module_compression' from source: unknown 15406 1726854949.42700: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15406 1726854949.42702: variable 'ansible_facts' from source: unknown 15406 1726854949.43122: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854949.3889568-16238-262799001056647/AnsiballZ_setup.py 15406 1726854949.43400: Sending initial data 15406 1726854949.43403: Sent initial data (154 bytes) 15406 1726854949.45015: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854949.45118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854949.45196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854949.45220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854949.45239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854949.45261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854949.45359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854949.46926: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854949.47011: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854949.47114: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpkc8tzitq /root/.ansible/tmp/ansible-tmp-1726854949.3889568-16238-262799001056647/AnsiballZ_setup.py <<< 15406 1726854949.47117: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854949.3889568-16238-262799001056647/AnsiballZ_setup.py" <<< 15406 1726854949.47192: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpkc8tzitq" to remote "/root/.ansible/tmp/ansible-tmp-1726854949.3889568-16238-262799001056647/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854949.3889568-16238-262799001056647/AnsiballZ_setup.py" <<< 15406 1726854949.48915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854949.48920: stdout chunk (state=3): >>><<< 15406 1726854949.48922: stderr chunk (state=3): >>><<< 15406 1726854949.48949: done transferring module to remote 15406 1726854949.49106: _low_level_execute_command(): starting 15406 1726854949.49109: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854949.3889568-16238-262799001056647/ /root/.ansible/tmp/ansible-tmp-1726854949.3889568-16238-262799001056647/AnsiballZ_setup.py && sleep 0' 15406 1726854949.49723: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854949.49746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854949.49775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854949.49883: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854949.49928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854949.50005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854949.51769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854949.51810: stderr chunk (state=3): >>><<< 15406 1726854949.51813: stdout chunk (state=3): >>><<< 15406 1726854949.51830: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854949.51833: _low_level_execute_command(): starting 15406 1726854949.51838: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854949.3889568-16238-262799001056647/AnsiballZ_setup.py && sleep 0' 15406 1726854949.52385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854949.52403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854949.52406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854949.52474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854949.52546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 15406 1726854950.15745: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "55", "second": "49", "epoch": "1726854949", "epoch_int": "1726854949", "date": "2024-09-20", "time": "13:55:49", "iso8601_micro": "2024-09-20T17:55:49.795680Z", "iso8601": "2024-09-20T17:55:49Z", "iso8601_basic": "20240920T135549795680", "iso8601_basic_short": "20240920T135549", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": 
"unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_loadavg": {"1m": 0.42333984375, "5m": 0.36083984375, "15m": 0.1796875}, "ansible_interfaces": ["LSR-TST-br31", "eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off 
[fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "46:8a:ac:06:3d:16", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", 
"tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all"<<< 15406 1726854950.15757: stdout chunk (state=3): >>>: "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, 
"ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2961, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 570, "free": 2961}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 733, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797163008, "block_size": 4096, "block_total": 65519099, "block_available": 63915323, "block_used": 1603776, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15406 1726854950.17878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854950.17884: stdout chunk (state=3): >>><<< 15406 1726854950.17886: stderr chunk (state=3): >>><<< 15406 1726854950.18100: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], 
"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "55", "second": "49", "epoch": "1726854949", "epoch_int": "1726854949", "date": "2024-09-20", "time": "13:55:49", "iso8601_micro": "2024-09-20T17:55:49.795680Z", "iso8601": "2024-09-20T17:55:49Z", "iso8601_basic": "20240920T135549795680", "iso8601_basic_short": "20240920T135549", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", 
"ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": 
"/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_loadavg": {"1m": 0.42333984375, "5m": 0.36083984375, "15m": 0.1796875}, "ansible_interfaces": ["LSR-TST-br31", "eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": 
"on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "46:8a:ac:06:3d:16", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": 
{"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": 
"off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2961, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 570, "free": 2961}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": 
[], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 733, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797163008, "block_size": 4096, "block_total": 65519099, "block_available": 63915323, "block_used": 1603776, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854950.18999: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854949.3889568-16238-262799001056647/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854950.19003: _low_level_execute_command(): starting 15406 1726854950.19005: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854949.3889568-16238-262799001056647/ > /dev/null 2>&1 && sleep 0' 15406 1726854950.20646: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config <<< 15406 1726854950.20662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854950.20672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854950.20909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854950.20922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854950.21013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854950.22940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854950.22954: stderr chunk (state=3): >>><<< 15406 1726854950.22962: stdout chunk (state=3): >>><<< 15406 1726854950.22982: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854950.23002: handler run complete 15406 1726854950.23219: variable 'ansible_facts' from source: unknown 15406 1726854950.23343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854950.23703: variable 'ansible_facts' from source: unknown 15406 1726854950.23798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854950.23945: attempt loop complete, returning result 15406 1726854950.23955: _execute() done 15406 1726854950.23962: dumping result to json 15406 1726854950.24008: done dumping result, returning 15406 1726854950.24020: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-3c83-32d3-00000000024e] 15406 1726854950.24028: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000024e ok: [managed_node2] 15406 1726854950.25014: no more pending results, returning what we have 15406 1726854950.25017: results queue empty 15406 1726854950.25018: checking for any_errors_fatal 15406 1726854950.25019: done checking for any_errors_fatal 15406 1726854950.25020: checking for max_fail_percentage 15406 
1726854950.25022: done checking for max_fail_percentage 15406 1726854950.25023: checking to see if all hosts have failed and the running result is not ok 15406 1726854950.25024: done checking to see if all hosts have failed 15406 1726854950.25025: getting the remaining hosts for this loop 15406 1726854950.25026: done getting the remaining hosts for this loop 15406 1726854950.25030: getting the next task for host managed_node2 15406 1726854950.25035: done getting next task for host managed_node2 15406 1726854950.25037: ^ task is: TASK: meta (flush_handlers) 15406 1726854950.25039: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854950.25043: getting variables 15406 1726854950.25044: in VariableManager get_vars() 15406 1726854950.25067: Calling all_inventory to load vars for managed_node2 15406 1726854950.25069: Calling groups_inventory to load vars for managed_node2 15406 1726854950.25073: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854950.25081: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000024e 15406 1726854950.25084: WORKER PROCESS EXITING 15406 1726854950.25095: Calling all_plugins_play to load vars for managed_node2 15406 1726854950.25098: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854950.25102: Calling groups_plugins_play to load vars for managed_node2 15406 1726854950.26511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854950.28975: done with get_vars() 15406 1726854950.29010: done getting variables 15406 1726854950.29090: in VariableManager get_vars() 15406 1726854950.29102: Calling all_inventory to load vars for managed_node2 15406 1726854950.29105: 
Calling groups_inventory to load vars for managed_node2 15406 1726854950.29107: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854950.29112: Calling all_plugins_play to load vars for managed_node2 15406 1726854950.29115: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854950.29117: Calling groups_plugins_play to load vars for managed_node2 15406 1726854950.30329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854950.32385: done with get_vars() 15406 1726854950.32454: done queuing things up, now waiting for results queue to drain 15406 1726854950.32456: results queue empty 15406 1726854950.32457: checking for any_errors_fatal 15406 1726854950.32460: done checking for any_errors_fatal 15406 1726854950.32461: checking for max_fail_percentage 15406 1726854950.32462: done checking for max_fail_percentage 15406 1726854950.32463: checking to see if all hosts have failed and the running result is not ok 15406 1726854950.32468: done checking to see if all hosts have failed 15406 1726854950.32468: getting the remaining hosts for this loop 15406 1726854950.32469: done getting the remaining hosts for this loop 15406 1726854950.32472: getting the next task for host managed_node2 15406 1726854950.32480: done getting next task for host managed_node2 15406 1726854950.32483: ^ task is: TASK: Include the task '{{ task }}' 15406 1726854950.32485: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854950.32490: getting variables 15406 1726854950.32493: in VariableManager get_vars() 15406 1726854950.32503: Calling all_inventory to load vars for managed_node2 15406 1726854950.32519: Calling groups_inventory to load vars for managed_node2 15406 1726854950.32522: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854950.32527: Calling all_plugins_play to load vars for managed_node2 15406 1726854950.32529: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854950.32532: Calling groups_plugins_play to load vars for managed_node2 15406 1726854950.33989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854950.36423: done with get_vars() 15406 1726854950.36514: done getting variables 15406 1726854950.36926: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_present.yml'] ********************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 13:55:50 -0400 (0:00:01.032) 0:00:18.192 ****** 15406 1726854950.36963: entering _queue_task() for managed_node2/include_tasks 15406 1726854950.37827: worker is 1 (out of 1 available) 15406 1726854950.37840: exiting _queue_task() for managed_node2/include_tasks 15406 1726854950.37853: done queuing things up, now waiting for results queue to drain 15406 1726854950.37854: waiting for pending results... 
15406 1726854950.38171: running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_profile_present.yml' 15406 1726854950.38331: in run() - task 0affcc66-ac2b-3c83-32d3-000000000031 15406 1726854950.38393: variable 'ansible_search_path' from source: unknown 15406 1726854950.38423: calling self._execute() 15406 1726854950.38559: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854950.38572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854950.38606: variable 'omit' from source: magic vars 15406 1726854950.39150: variable 'ansible_distribution_major_version' from source: facts 15406 1726854950.39154: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854950.39157: variable 'task' from source: play vars 15406 1726854950.39215: variable 'task' from source: play vars 15406 1726854950.39229: _execute() done 15406 1726854950.39237: dumping result to json 15406 1726854950.39245: done dumping result, returning 15406 1726854950.39366: done running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_profile_present.yml' [0affcc66-ac2b-3c83-32d3-000000000031] 15406 1726854950.39369: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000031 15406 1726854950.39447: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000031 15406 1726854950.39451: WORKER PROCESS EXITING 15406 1726854950.39483: no more pending results, returning what we have 15406 1726854950.39490: in VariableManager get_vars() 15406 1726854950.39524: Calling all_inventory to load vars for managed_node2 15406 1726854950.39527: Calling groups_inventory to load vars for managed_node2 15406 1726854950.39530: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854950.39544: Calling all_plugins_play to load vars for managed_node2 15406 1726854950.39546: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854950.39548: Calling 
groups_plugins_play to load vars for managed_node2 15406 1726854950.40869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854950.41746: done with get_vars() 15406 1726854950.41761: variable 'ansible_search_path' from source: unknown 15406 1726854950.41772: we have included files to process 15406 1726854950.41773: generating all_blocks data 15406 1726854950.41774: done generating all_blocks data 15406 1726854950.41775: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15406 1726854950.41775: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15406 1726854950.41777: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15406 1726854950.41911: in VariableManager get_vars() 15406 1726854950.41922: done with get_vars() 15406 1726854950.42095: done processing included file 15406 1726854950.42097: iterating over new_blocks loaded from include file 15406 1726854950.42098: in VariableManager get_vars() 15406 1726854950.42106: done with get_vars() 15406 1726854950.42107: filtering new block on tags 15406 1726854950.42120: done filtering new block on tags 15406 1726854950.42121: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 15406 1726854950.42125: extending task lists for all hosts with included blocks 15406 1726854950.42143: done extending task lists 15406 1726854950.42144: done processing included files 15406 1726854950.42144: results queue empty 15406 1726854950.42144: checking for any_errors_fatal 15406 1726854950.42146: done checking for any_errors_fatal 15406 
1726854950.42147: checking for max_fail_percentage 15406 1726854950.42147: done checking for max_fail_percentage 15406 1726854950.42148: checking to see if all hosts have failed and the running result is not ok 15406 1726854950.42149: done checking to see if all hosts have failed 15406 1726854950.42150: getting the remaining hosts for this loop 15406 1726854950.42150: done getting the remaining hosts for this loop 15406 1726854950.42152: getting the next task for host managed_node2 15406 1726854950.42155: done getting next task for host managed_node2 15406 1726854950.42156: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15406 1726854950.42159: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854950.42161: getting variables 15406 1726854950.42161: in VariableManager get_vars() 15406 1726854950.42167: Calling all_inventory to load vars for managed_node2 15406 1726854950.42168: Calling groups_inventory to load vars for managed_node2 15406 1726854950.42170: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854950.42173: Calling all_plugins_play to load vars for managed_node2 15406 1726854950.42175: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854950.42176: Calling groups_plugins_play to load vars for managed_node2 15406 1726854950.43997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854950.45186: done with get_vars() 15406 1726854950.45205: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:55:50 -0400 (0:00:00.082) 0:00:18.275 ****** 15406 1726854950.45256: entering _queue_task() for managed_node2/include_tasks 15406 1726854950.45507: worker is 1 (out of 1 available) 15406 1726854950.45521: exiting _queue_task() for managed_node2/include_tasks 15406 1726854950.45533: done queuing things up, now waiting for results queue to drain 15406 1726854950.45534: waiting for pending results... 
15406 1726854950.45708: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 15406 1726854950.45777: in run() - task 0affcc66-ac2b-3c83-32d3-00000000025f 15406 1726854950.45788: variable 'ansible_search_path' from source: unknown 15406 1726854950.45795: variable 'ansible_search_path' from source: unknown 15406 1726854950.45819: calling self._execute() 15406 1726854950.45904: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854950.45910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854950.46096: variable 'omit' from source: magic vars 15406 1726854950.46467: variable 'ansible_distribution_major_version' from source: facts 15406 1726854950.46484: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854950.46501: _execute() done 15406 1726854950.46509: dumping result to json 15406 1726854950.46517: done dumping result, returning 15406 1726854950.46527: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0affcc66-ac2b-3c83-32d3-00000000025f] 15406 1726854950.46536: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000025f 15406 1726854950.46678: no more pending results, returning what we have 15406 1726854950.46684: in VariableManager get_vars() 15406 1726854950.46720: Calling all_inventory to load vars for managed_node2 15406 1726854950.46725: Calling groups_inventory to load vars for managed_node2 15406 1726854950.46730: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854950.46746: Calling all_plugins_play to load vars for managed_node2 15406 1726854950.46755: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854950.46760: Calling groups_plugins_play to load vars for managed_node2 15406 1726854950.47301: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000025f 15406 1726854950.47305: WORKER PROCESS EXITING 15406 
1726854950.48517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854950.51811: done with get_vars() 15406 1726854950.51841: variable 'ansible_search_path' from source: unknown 15406 1726854950.51842: variable 'ansible_search_path' from source: unknown 15406 1726854950.51853: variable 'task' from source: play vars 15406 1726854950.52116: variable 'task' from source: play vars 15406 1726854950.52152: we have included files to process 15406 1726854950.52153: generating all_blocks data 15406 1726854950.52155: done generating all_blocks data 15406 1726854950.52156: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15406 1726854950.52157: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15406 1726854950.52159: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15406 1726854950.54347: done processing included file 15406 1726854950.54349: iterating over new_blocks loaded from include file 15406 1726854950.54351: in VariableManager get_vars() 15406 1726854950.54364: done with get_vars() 15406 1726854950.54365: filtering new block on tags 15406 1726854950.54390: done filtering new block on tags 15406 1726854950.54396: in VariableManager get_vars() 15406 1726854950.54408: done with get_vars() 15406 1726854950.54409: filtering new block on tags 15406 1726854950.54430: done filtering new block on tags 15406 1726854950.54432: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 15406 1726854950.54437: extending task lists for all hosts with included blocks 15406 1726854950.54615: done extending 
task lists 15406 1726854950.54616: done processing included files 15406 1726854950.54617: results queue empty 15406 1726854950.54618: checking for any_errors_fatal 15406 1726854950.54621: done checking for any_errors_fatal 15406 1726854950.54622: checking for max_fail_percentage 15406 1726854950.54623: done checking for max_fail_percentage 15406 1726854950.54623: checking to see if all hosts have failed and the running result is not ok 15406 1726854950.54624: done checking to see if all hosts have failed 15406 1726854950.54625: getting the remaining hosts for this loop 15406 1726854950.54626: done getting the remaining hosts for this loop 15406 1726854950.54629: getting the next task for host managed_node2 15406 1726854950.54633: done getting next task for host managed_node2 15406 1726854950.54635: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15406 1726854950.54638: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854950.54640: getting variables 15406 1726854950.54641: in VariableManager get_vars() 15406 1726854950.59183: Calling all_inventory to load vars for managed_node2 15406 1726854950.59186: Calling groups_inventory to load vars for managed_node2 15406 1726854950.59193: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854950.59199: Calling all_plugins_play to load vars for managed_node2 15406 1726854950.59202: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854950.59205: Calling groups_plugins_play to load vars for managed_node2 15406 1726854950.60383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854950.61963: done with get_vars() 15406 1726854950.61985: done getting variables 15406 1726854950.62031: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:55:50 -0400 (0:00:00.167) 0:00:18.443 ****** 15406 1726854950.62058: entering _queue_task() for managed_node2/set_fact 15406 1726854950.62408: worker is 1 (out of 1 available) 15406 1726854950.62420: exiting _queue_task() for managed_node2/set_fact 15406 1726854950.62432: done queuing things up, now waiting for results queue to drain 15406 1726854950.62434: waiting for pending results... 
15406 1726854950.62909: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 15406 1726854950.63298: in run() - task 0affcc66-ac2b-3c83-32d3-00000000026c 15406 1726854950.63302: variable 'ansible_search_path' from source: unknown 15406 1726854950.63305: variable 'ansible_search_path' from source: unknown 15406 1726854950.63308: calling self._execute() 15406 1726854950.63397: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854950.63596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854950.63600: variable 'omit' from source: magic vars 15406 1726854950.64090: variable 'ansible_distribution_major_version' from source: facts 15406 1726854950.64109: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854950.64120: variable 'omit' from source: magic vars 15406 1726854950.64198: variable 'omit' from source: magic vars 15406 1726854950.64238: variable 'omit' from source: magic vars 15406 1726854950.64289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854950.64332: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854950.64357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854950.64380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854950.64405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854950.64440: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854950.64449: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854950.64457: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 15406 1726854950.64559: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854950.64574: Set connection var ansible_timeout to 10 15406 1726854950.64582: Set connection var ansible_connection to ssh 15406 1726854950.64598: Set connection var ansible_shell_type to sh 15406 1726854950.64612: Set connection var ansible_shell_executable to /bin/sh 15406 1726854950.64623: Set connection var ansible_pipelining to False 15406 1726854950.64652: variable 'ansible_shell_executable' from source: unknown 15406 1726854950.64660: variable 'ansible_connection' from source: unknown 15406 1726854950.64667: variable 'ansible_module_compression' from source: unknown 15406 1726854950.64675: variable 'ansible_shell_type' from source: unknown 15406 1726854950.64683: variable 'ansible_shell_executable' from source: unknown 15406 1726854950.64693: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854950.64701: variable 'ansible_pipelining' from source: unknown 15406 1726854950.64713: variable 'ansible_timeout' from source: unknown 15406 1726854950.64723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854950.64867: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854950.64931: variable 'omit' from source: magic vars 15406 1726854950.64934: starting attempt loop 15406 1726854950.64936: running the handler 15406 1726854950.64938: handler run complete 15406 1726854950.64940: attempt loop complete, returning result 15406 1726854950.64942: _execute() done 15406 1726854950.64945: dumping result to json 15406 1726854950.64947: done dumping result, returning 15406 1726854950.64954: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcc66-ac2b-3c83-32d3-00000000026c] 15406 1726854950.64962: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000026c 15406 1726854950.65221: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000026c 15406 1726854950.65224: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15406 1726854950.65278: no more pending results, returning what we have 15406 1726854950.65283: results queue empty 15406 1726854950.65283: checking for any_errors_fatal 15406 1726854950.65285: done checking for any_errors_fatal 15406 1726854950.65286: checking for max_fail_percentage 15406 1726854950.65289: done checking for max_fail_percentage 15406 1726854950.65291: checking to see if all hosts have failed and the running result is not ok 15406 1726854950.65294: done checking to see if all hosts have failed 15406 1726854950.65295: getting the remaining hosts for this loop 15406 1726854950.65296: done getting the remaining hosts for this loop 15406 1726854950.65300: getting the next task for host managed_node2 15406 1726854950.65307: done getting next task for host managed_node2 15406 1726854950.65310: ^ task is: TASK: Stat profile file 15406 1726854950.65314: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854950.65319: getting variables 15406 1726854950.65320: in VariableManager get_vars() 15406 1726854950.65349: Calling all_inventory to load vars for managed_node2 15406 1726854950.65352: Calling groups_inventory to load vars for managed_node2 15406 1726854950.65355: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854950.65367: Calling all_plugins_play to load vars for managed_node2 15406 1726854950.65370: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854950.65373: Calling groups_plugins_play to load vars for managed_node2 15406 1726854950.68139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854950.70237: done with get_vars() 15406 1726854950.70263: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:55:50 -0400 (0:00:00.083) 0:00:18.526 ****** 15406 1726854950.70367: entering _queue_task() for managed_node2/stat 15406 1726854950.70720: worker is 1 (out of 1 available) 15406 1726854950.70733: exiting _queue_task() for managed_node2/stat 15406 1726854950.70744: done queuing things up, now waiting for results queue to drain 15406 1726854950.70746: waiting for pending results... 
15406 1726854950.70984: running TaskExecutor() for managed_node2/TASK: Stat profile file 15406 1726854950.71123: in run() - task 0affcc66-ac2b-3c83-32d3-00000000026d 15406 1726854950.71394: variable 'ansible_search_path' from source: unknown 15406 1726854950.71399: variable 'ansible_search_path' from source: unknown 15406 1726854950.71403: calling self._execute() 15406 1726854950.71405: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854950.71408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854950.71410: variable 'omit' from source: magic vars 15406 1726854950.72260: variable 'ansible_distribution_major_version' from source: facts 15406 1726854950.72276: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854950.72327: variable 'omit' from source: magic vars 15406 1726854950.72514: variable 'omit' from source: magic vars 15406 1726854950.72613: variable 'profile' from source: play vars 15406 1726854950.72623: variable 'interface' from source: set_fact 15406 1726854950.72761: variable 'interface' from source: set_fact 15406 1726854950.72786: variable 'omit' from source: magic vars 15406 1726854950.72837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854950.72880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854950.72910: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854950.72932: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854950.72953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854950.72993: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 
1726854950.73003: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854950.73010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854950.73116: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854950.73128: Set connection var ansible_timeout to 10 15406 1726854950.73135: Set connection var ansible_connection to ssh 15406 1726854950.73144: Set connection var ansible_shell_type to sh 15406 1726854950.73157: Set connection var ansible_shell_executable to /bin/sh 15406 1726854950.73171: Set connection var ansible_pipelining to False 15406 1726854950.73202: variable 'ansible_shell_executable' from source: unknown 15406 1726854950.73270: variable 'ansible_connection' from source: unknown 15406 1726854950.73273: variable 'ansible_module_compression' from source: unknown 15406 1726854950.73276: variable 'ansible_shell_type' from source: unknown 15406 1726854950.73278: variable 'ansible_shell_executable' from source: unknown 15406 1726854950.73280: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854950.73283: variable 'ansible_pipelining' from source: unknown 15406 1726854950.73285: variable 'ansible_timeout' from source: unknown 15406 1726854950.73289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854950.73452: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854950.73468: variable 'omit' from source: magic vars 15406 1726854950.73477: starting attempt loop 15406 1726854950.73489: running the handler 15406 1726854950.73505: _low_level_execute_command(): starting 15406 1726854950.73514: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854950.74217: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854950.74261: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854950.74294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15406 1726854950.74311: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854950.74394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854950.74413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854950.74511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854950.76270: stdout chunk (state=3): >>>/root <<< 15406 1726854950.76488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854950.76492: stdout chunk (state=3): >>><<< 15406 1726854950.76495: stderr chunk (state=3): >>><<< 15406 1726854950.76641: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854950.76645: _low_level_execute_command(): starting 15406 1726854950.76648: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854950.7654233-16288-193943208547174 `" && echo ansible-tmp-1726854950.7654233-16288-193943208547174="` echo /root/.ansible/tmp/ansible-tmp-1726854950.7654233-16288-193943208547174 `" ) && sleep 0' 15406 1726854950.77895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854950.77931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854950.77947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854950.78014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854950.78047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854950.78096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854950.78133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854950.78156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854950.78280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854950.80206: stdout chunk (state=3): >>>ansible-tmp-1726854950.7654233-16288-193943208547174=/root/.ansible/tmp/ansible-tmp-1726854950.7654233-16288-193943208547174 <<< 15406 1726854950.80339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854950.80343: stdout chunk (state=3): >>><<< 15406 1726854950.80345: stderr chunk (state=3): >>><<< 15406 1726854950.80363: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854950.7654233-16288-193943208547174=/root/.ansible/tmp/ansible-tmp-1726854950.7654233-16288-193943208547174 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854950.80495: variable 'ansible_module_compression' from source: unknown 15406 1726854950.80498: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15406 1726854950.80541: variable 'ansible_facts' from source: unknown 15406 1726854950.80635: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854950.7654233-16288-193943208547174/AnsiballZ_stat.py 15406 1726854950.80855: Sending initial data 15406 1726854950.80859: Sent initial data (153 bytes) 15406 1726854950.81395: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854950.81411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854950.81427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854950.81505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854950.81546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854950.81561: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854950.81598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854950.81700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854950.83311: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854950.83418: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854950.83521: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp3e229iv4 /root/.ansible/tmp/ansible-tmp-1726854950.7654233-16288-193943208547174/AnsiballZ_stat.py <<< 15406 1726854950.83525: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854950.7654233-16288-193943208547174/AnsiballZ_stat.py" <<< 15406 1726854950.83649: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp3e229iv4" to remote "/root/.ansible/tmp/ansible-tmp-1726854950.7654233-16288-193943208547174/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854950.7654233-16288-193943208547174/AnsiballZ_stat.py" <<< 15406 1726854950.84903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854950.84955: stderr chunk (state=3): >>><<< 15406 1726854950.84967: stdout chunk (state=3): >>><<< 15406 1726854950.85024: done transferring module to remote 15406 1726854950.85134: _low_level_execute_command(): starting 15406 1726854950.85138: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854950.7654233-16288-193943208547174/ /root/.ansible/tmp/ansible-tmp-1726854950.7654233-16288-193943208547174/AnsiballZ_stat.py && sleep 0' 15406 1726854950.85919: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854950.86007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854950.86033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854950.86048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854950.86064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854950.86232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854950.88007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854950.88049: stderr chunk (state=3): >>><<< 15406 1726854950.88063: stdout chunk (state=3): >>><<< 15406 1726854950.88156: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854950.88161: _low_level_execute_command(): starting 15406 1726854950.88163: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854950.7654233-16288-193943208547174/AnsiballZ_stat.py && sleep 0' 15406 1726854950.88955: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854950.88984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854950.89075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854950.89122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854950.89143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854950.89160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
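[editor's note] The remote-side sequence traced in the chunks above — create a private temp dir under `~/.ansible/tmp`, upload `AnsiballZ_stat.py` over the multiplexed SSH connection, `chmod u+x` the dir and wrapper, execute with the remote interpreter, then remove the temp dir — can be reproduced locally with the sketch below. The directory naming and the payload script here are stand-ins (the real AnsiballZ wrapper is a zipped module bundle, not a one-liner); only the command shapes mirror the log.

```shell
# Sketch of Ansible's low-level remote execution steps, assuming a POSIX sh
# and a python3 on PATH. Names ("-demo" suffix, payload contents) are illustrative.
tmproot="$HOME/.ansible/tmp"
tmpdir="$tmproot/ansible-tmp-$(date +%s)-$$-demo"

# 1. Private temp dir (umask 77 => mode 700), as in the log's mkdir -p command
( umask 77 && mkdir -p "$tmproot" && mkdir "$tmpdir" )

# 2. "Transfer" the module (the log does this with an sftp put over the
#    existing ControlMaster socket); this placeholder just emits the same
#    JSON shape the real stat module returned for the missing profile file
printf '%s\n' 'import json; print(json.dumps({"changed": False, "stat": {"exists": False}}))' > "$tmpdir/AnsiballZ_stat.py"

# 3. chmod u+x on both the temp dir and the wrapper, exactly as in the log
chmod u+x "$tmpdir/" "$tmpdir/AnsiballZ_stat.py"

# 4. Execute with the interpreter, then 5. clean up, as in the final rm command
python3 "$tmpdir/AnsiballZ_stat.py"
rm -f -r "$tmpdir" > /dev/null 2>&1
```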
15406 1726854950.89306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854951.04484: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15406 1726854951.05688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854951.05706: stdout chunk (state=3): >>><<< 15406 1726854951.05723: stderr chunk (state=3): >>><<< 15406 1726854951.05892: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854951.05896: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854950.7654233-16288-193943208547174/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854951.05899: _low_level_execute_command(): starting 15406 1726854951.05901: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854950.7654233-16288-193943208547174/ > /dev/null 2>&1 && sleep 0' 15406 1726854951.07172: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854951.07258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854951.07275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854951.07503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854951.07534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854951.07664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854951.09696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854951.09708: stdout chunk (state=3): >>><<< 15406 1726854951.09720: stderr chunk (state=3): >>><<< 15406 1726854951.09967: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854951.09971: handler run complete 15406 1726854951.09973: attempt loop complete, returning result 15406 1726854951.09975: _execute() done 15406 1726854951.09978: dumping result to json 15406 1726854951.09979: done dumping result, returning 15406 1726854951.09981: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0affcc66-ac2b-3c83-32d3-00000000026d] 15406 1726854951.09983: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000026d 15406 1726854951.10054: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000026d ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 15406 1726854951.10351: no more pending results, returning what we have 15406 1726854951.10355: results queue empty 15406 1726854951.10356: checking for any_errors_fatal 15406 1726854951.10364: done checking for any_errors_fatal 15406 1726854951.10365: checking for max_fail_percentage 15406 1726854951.10367: done checking for max_fail_percentage 15406 1726854951.10368: checking to see if all hosts have failed and the running result is not ok 15406 1726854951.10369: done checking to see if all hosts have failed 15406 1726854951.10370: getting the remaining hosts for this loop 15406 1726854951.10371: done getting the remaining hosts for this loop 15406 1726854951.10375: getting the next task for host managed_node2 15406 1726854951.10383: done getting next task for host managed_node2 15406 1726854951.10386: ^ task is: TASK: Set NM profile exist flag based on the profile files 15406 1726854951.10392: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854951.10397: getting variables 15406 1726854951.10398: in VariableManager get_vars() 15406 1726854951.10428: Calling all_inventory to load vars for managed_node2 15406 1726854951.10431: Calling groups_inventory to load vars for managed_node2 15406 1726854951.10435: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854951.10447: Calling all_plugins_play to load vars for managed_node2 15406 1726854951.10449: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854951.10452: Calling groups_plugins_play to load vars for managed_node2 15406 1726854951.11502: WORKER PROCESS EXITING 15406 1726854951.14328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854951.17276: done with get_vars() 15406 1726854951.17406: done getting variables 15406 1726854951.17469: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:55:51 -0400 (0:00:00.471) 0:00:18.997 ****** 15406 1726854951.17507: entering _queue_task() for managed_node2/set_fact 15406 1726854951.18212: worker is 1 (out of 1 available) 15406 1726854951.18224: exiting _queue_task() for managed_node2/set_fact 15406 1726854951.18235: done queuing things up, now waiting for results queue to drain 15406 1726854951.18237: waiting for pending results... 15406 1726854951.18432: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 15406 1726854951.18607: in run() - task 0affcc66-ac2b-3c83-32d3-00000000026e 15406 1726854951.18630: variable 'ansible_search_path' from source: unknown 15406 1726854951.18638: variable 'ansible_search_path' from source: unknown 15406 1726854951.18682: calling self._execute() 15406 1726854951.18782: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854951.18797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854951.18812: variable 'omit' from source: magic vars 15406 1726854951.19240: variable 'ansible_distribution_major_version' from source: facts 15406 1726854951.19277: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854951.19411: variable 'profile_stat' from source: set_fact 15406 1726854951.19429: Evaluated conditional (profile_stat.stat.exists): False 15406 1726854951.19442: when evaluation is False, skipping this task 15406 1726854951.19450: _execute() done 15406 1726854951.19457: dumping result to json 15406 1726854951.19464: done dumping result, returning 15406 1726854951.19475: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0affcc66-ac2b-3c83-32d3-00000000026e] 15406 1726854951.19492: sending task result for task 
0affcc66-ac2b-3c83-32d3-00000000026e 15406 1726854951.19693: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000026e 15406 1726854951.19701: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15406 1726854951.19752: no more pending results, returning what we have 15406 1726854951.19756: results queue empty 15406 1726854951.19757: checking for any_errors_fatal 15406 1726854951.19767: done checking for any_errors_fatal 15406 1726854951.19768: checking for max_fail_percentage 15406 1726854951.19770: done checking for max_fail_percentage 15406 1726854951.19771: checking to see if all hosts have failed and the running result is not ok 15406 1726854951.19772: done checking to see if all hosts have failed 15406 1726854951.19772: getting the remaining hosts for this loop 15406 1726854951.19774: done getting the remaining hosts for this loop 15406 1726854951.19778: getting the next task for host managed_node2 15406 1726854951.19786: done getting next task for host managed_node2 15406 1726854951.19791: ^ task is: TASK: Get NM profile info 15406 1726854951.19795: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 15406 1726854951.19799: getting variables 15406 1726854951.19801: in VariableManager get_vars() 15406 1726854951.19834: Calling all_inventory to load vars for managed_node2 15406 1726854951.19837: Calling groups_inventory to load vars for managed_node2 15406 1726854951.19841: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854951.19855: Calling all_plugins_play to load vars for managed_node2 15406 1726854951.19858: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854951.19861: Calling groups_plugins_play to load vars for managed_node2 15406 1726854951.22321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854951.24004: done with get_vars() 15406 1726854951.24027: done getting variables 15406 1726854951.24123: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:55:51 -0400 (0:00:00.066) 0:00:19.064 ****** 15406 1726854951.24155: entering _queue_task() for managed_node2/shell 15406 1726854951.24157: Creating lock for shell 15406 1726854951.24705: worker is 1 (out of 1 available) 15406 1726854951.24715: exiting _queue_task() for managed_node2/shell 15406 1726854951.24725: done queuing things up, now waiting for results queue to drain 15406 1726854951.24726: waiting for pending results... 
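[editor's note] The "Stat profile file" → "Set NM profile exist flag" handoff traced above is a stat-and-gate pattern: the stat module reports an `exists` flag, and the follow-up `set_fact` task is skipped because the conditional `profile_stat.stat.exists` evaluates False. A minimal plain-sh sketch of that gating, using the ifcfg path from the module_args in the log (whether it exists depends on the machine this runs on):

```shell
# Stand-in for the stat module's reply seen in the log:
# {"changed": false, "stat": {"exists": false}} for the missing profile file.
profile_file="/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31"
if [ -e "$profile_file" ]; then exists=true; else exists=false; fi
printf '{"changed": false, "stat": {"exists": %s}}\n' "$exists"

# Mirrors Ansible's `when: profile_stat.stat.exists` evaluation: when it is
# False the task is skipped with "Conditional result was False", as logged.
if [ "$exists" = false ]; then
    echo 'skipping: Conditional result was False'
fi
```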
15406 1726854951.24855: running TaskExecutor() for managed_node2/TASK: Get NM profile info 15406 1726854951.25002: in run() - task 0affcc66-ac2b-3c83-32d3-00000000026f 15406 1726854951.25023: variable 'ansible_search_path' from source: unknown 15406 1726854951.25029: variable 'ansible_search_path' from source: unknown 15406 1726854951.25071: calling self._execute() 15406 1726854951.25173: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854951.25197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854951.25211: variable 'omit' from source: magic vars 15406 1726854951.25619: variable 'ansible_distribution_major_version' from source: facts 15406 1726854951.25637: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854951.25648: variable 'omit' from source: magic vars 15406 1726854951.25702: variable 'omit' from source: magic vars 15406 1726854951.25833: variable 'profile' from source: play vars 15406 1726854951.25837: variable 'interface' from source: set_fact 15406 1726854951.25930: variable 'interface' from source: set_fact 15406 1726854951.25933: variable 'omit' from source: magic vars 15406 1726854951.25974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854951.26023: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854951.26063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854951.26149: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854951.26152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854951.26157: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 
1726854951.26160: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854951.26163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854951.26274: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854951.26291: Set connection var ansible_timeout to 10 15406 1726854951.26301: Set connection var ansible_connection to ssh 15406 1726854951.26309: Set connection var ansible_shell_type to sh 15406 1726854951.26316: Set connection var ansible_shell_executable to /bin/sh 15406 1726854951.26366: Set connection var ansible_pipelining to False 15406 1726854951.26368: variable 'ansible_shell_executable' from source: unknown 15406 1726854951.26370: variable 'ansible_connection' from source: unknown 15406 1726854951.26372: variable 'ansible_module_compression' from source: unknown 15406 1726854951.26373: variable 'ansible_shell_type' from source: unknown 15406 1726854951.26375: variable 'ansible_shell_executable' from source: unknown 15406 1726854951.26376: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854951.26378: variable 'ansible_pipelining' from source: unknown 15406 1726854951.26380: variable 'ansible_timeout' from source: unknown 15406 1726854951.26381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854951.26526: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854951.26543: variable 'omit' from source: magic vars 15406 1726854951.26554: starting attempt loop 15406 1726854951.26585: running the handler 15406 1726854951.26590: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854951.26612: _low_level_execute_command(): starting 15406 1726854951.26624: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854951.27474: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854951.27519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854951.27542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854951.27597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854951.27667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854951.29360: stdout chunk (state=3): >>>/root <<< 15406 1726854951.29441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854951.29468: stderr chunk (state=3): >>><<< 15406 1726854951.29472: stdout chunk 
(state=3): >>><<< 15406 1726854951.29497: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854951.29507: _low_level_execute_command(): starting 15406 1726854951.29512: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854951.294956-16314-145106435151567 `" && echo ansible-tmp-1726854951.294956-16314-145106435151567="` echo /root/.ansible/tmp/ansible-tmp-1726854951.294956-16314-145106435151567 `" ) && sleep 0' 15406 1726854951.29935: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854951.29945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854951.29948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 15406 1726854951.29951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854951.29953: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854951.29993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854951.29999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854951.30075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854951.31973: stdout chunk (state=3): >>>ansible-tmp-1726854951.294956-16314-145106435151567=/root/.ansible/tmp/ansible-tmp-1726854951.294956-16314-145106435151567 <<< 15406 1726854951.32120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854951.32142: stderr chunk (state=3): >>><<< 15406 1726854951.32145: stdout chunk (state=3): >>><<< 15406 1726854951.32160: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854951.294956-16314-145106435151567=/root/.ansible/tmp/ansible-tmp-1726854951.294956-16314-145106435151567 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854951.32185: variable 'ansible_module_compression' from source: unknown 15406 1726854951.32265: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15406 1726854951.32291: variable 'ansible_facts' from source: unknown 15406 1726854951.32357: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854951.294956-16314-145106435151567/AnsiballZ_command.py 15406 1726854951.32499: Sending initial data 15406 1726854951.32503: Sent initial data (155 bytes) 15406 1726854951.33185: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854951.33195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854951.33256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854951.34797: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15406 1726854951.34809: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854951.34868: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854951.34926: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpnaqva3w3 /root/.ansible/tmp/ansible-tmp-1726854951.294956-16314-145106435151567/AnsiballZ_command.py <<< 15406 1726854951.34930: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854951.294956-16314-145106435151567/AnsiballZ_command.py" <<< 15406 1726854951.35010: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpnaqva3w3" to remote "/root/.ansible/tmp/ansible-tmp-1726854951.294956-16314-145106435151567/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854951.294956-16314-145106435151567/AnsiballZ_command.py" <<< 15406 1726854951.35938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854951.35942: stdout chunk (state=3): >>><<< 15406 1726854951.35944: stderr chunk (state=3): >>><<< 15406 1726854951.35946: done transferring module to remote 15406 1726854951.35948: _low_level_execute_command(): starting 15406 1726854951.35950: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854951.294956-16314-145106435151567/ /root/.ansible/tmp/ansible-tmp-1726854951.294956-16314-145106435151567/AnsiballZ_command.py && sleep 0' 15406 1726854951.36659: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854951.36674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 15406 1726854951.36685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854951.36743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854951.36774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854951.36795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854951.36895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854951.38649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854951.38672: stderr chunk (state=3): >>><<< 15406 1726854951.38676: stdout chunk (state=3): >>><<< 15406 1726854951.38711: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854951.38722: _low_level_execute_command(): starting 15406 1726854951.38725: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854951.294956-16314-145106435151567/AnsiballZ_command.py && sleep 0' 15406 1726854951.39343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854951.39347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854951.39349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854951.39362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854951.39387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854951.39393: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854951.39395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 15406 1726854951.39398: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854951.39447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854951.39451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854951.39460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854951.39540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854951.56186: stdout chunk (state=3): >>> {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 13:55:51.542276", "end": "2024-09-20 13:55:51.559369", "delta": "0:00:00.017093", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15406 1726854951.57533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 15406 1726854951.57563: stderr chunk (state=3): >>><<< 15406 1726854951.57567: stdout chunk (state=3): >>><<< 15406 1726854951.57591: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 13:55:51.542276", "end": "2024-09-20 13:55:51.559369", "delta": "0:00:00.017093", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854951.57639: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854951.294956-16314-145106435151567/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854951.57651: _low_level_execute_command(): starting 15406 1726854951.57656: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854951.294956-16314-145106435151567/ > /dev/null 2>&1 && sleep 0' 15406 1726854951.58426: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854951.58433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854951.58435: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15406 1726854951.58437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854951.58555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854951.58558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854951.58622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854951.60414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854951.60439: stderr chunk (state=3): >>><<< 15406 1726854951.60443: stdout chunk (state=3): >>><<< 15406 1726854951.60459: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854951.60465: handler run complete 15406 1726854951.60514: Evaluated conditional (False): False 15406 1726854951.60520: attempt loop complete, returning result 15406 1726854951.60523: _execute() done 15406 1726854951.60547: dumping result to json 15406 1726854951.60550: done dumping result, returning 15406 1726854951.60552: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0affcc66-ac2b-3c83-32d3-00000000026f] 15406 1726854951.60561: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000026f 15406 1726854951.60682: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000026f 15406 1726854951.60685: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.017093", "end": "2024-09-20 13:55:51.559369", "rc": 0, "start": "2024-09-20 13:55:51.542276" } STDOUT: LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection 15406 1726854951.60784: no more pending results, returning what we have 15406 1726854951.60789: results queue empty 15406 1726854951.60789: checking for any_errors_fatal 15406 1726854951.60797: done checking for any_errors_fatal 15406 1726854951.60798: checking for max_fail_percentage 15406 1726854951.60800: done checking for max_fail_percentage 15406 1726854951.60801: checking to see if all hosts have failed and the running result is not ok 15406 1726854951.60802: done checking to see if all hosts have failed 15406 1726854951.60803: getting the remaining hosts for this loop 15406 1726854951.60807: done getting the remaining hosts for this loop 15406 1726854951.60810: getting the next task for host managed_node2 15406 1726854951.60817: done getting next task for host managed_node2 15406 1726854951.60820: ^ task 
is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15406 1726854951.60823: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854951.60827: getting variables 15406 1726854951.60829: in VariableManager get_vars() 15406 1726854951.60859: Calling all_inventory to load vars for managed_node2 15406 1726854951.60862: Calling groups_inventory to load vars for managed_node2 15406 1726854951.60865: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854951.60874: Calling all_plugins_play to load vars for managed_node2 15406 1726854951.60876: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854951.60879: Calling groups_plugins_play to load vars for managed_node2 15406 1726854951.61877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854951.62825: done with get_vars() 15406 1726854951.62841: done getting variables 15406 1726854951.62885: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:55:51 -0400 (0:00:00.387) 0:00:19.451 ****** 15406 1726854951.62910: entering _queue_task() for managed_node2/set_fact 15406 1726854951.63139: worker is 1 (out of 1 available) 15406 1726854951.63152: exiting _queue_task() for managed_node2/set_fact 15406 1726854951.63164: done queuing things up, now waiting for results queue to drain 15406 1726854951.63165: waiting for pending results... 15406 1726854951.63355: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15406 1726854951.63457: in run() - task 0affcc66-ac2b-3c83-32d3-000000000270 15406 1726854951.63467: variable 'ansible_search_path' from source: unknown 15406 1726854951.63471: variable 'ansible_search_path' from source: unknown 15406 1726854951.63505: calling self._execute() 15406 1726854951.63579: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854951.63583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854951.63593: variable 'omit' from source: magic vars 15406 1726854951.63897: variable 'ansible_distribution_major_version' from source: facts 15406 1726854951.63907: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854951.64052: variable 'nm_profile_exists' from source: set_fact 15406 1726854951.64110: Evaluated conditional (nm_profile_exists.rc == 0): True 15406 1726854951.64113: variable 'omit' from source: magic vars 15406 1726854951.64130: variable 'omit' from source: magic vars 15406 1726854951.64152: 
variable 'omit' from source: magic vars 15406 1726854951.64201: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854951.64224: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854951.64240: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854951.64253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854951.64262: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854951.64306: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854951.64310: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854951.64312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854951.64374: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854951.64378: Set connection var ansible_timeout to 10 15406 1726854951.64381: Set connection var ansible_connection to ssh 15406 1726854951.64387: Set connection var ansible_shell_type to sh 15406 1726854951.64393: Set connection var ansible_shell_executable to /bin/sh 15406 1726854951.64405: Set connection var ansible_pipelining to False 15406 1726854951.64422: variable 'ansible_shell_executable' from source: unknown 15406 1726854951.64424: variable 'ansible_connection' from source: unknown 15406 1726854951.64427: variable 'ansible_module_compression' from source: unknown 15406 1726854951.64431: variable 'ansible_shell_type' from source: unknown 15406 1726854951.64433: variable 'ansible_shell_executable' from source: unknown 15406 1726854951.64435: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854951.64437: variable 'ansible_pipelining' from 
source: unknown
15406 1726854951.64439: variable 'ansible_timeout' from source: unknown
15406 1726854951.64442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854951.64559: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
15406 1726854951.64568: variable 'omit' from source: magic vars
15406 1726854951.64573: starting attempt loop
15406 1726854951.64576: running the handler
15406 1726854951.64585: handler run complete
15406 1726854951.64597: attempt loop complete, returning result
15406 1726854951.64600: _execute() done
15406 1726854951.64603: dumping result to json
15406 1726854951.64605: done dumping result, returning
15406 1726854951.64613: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcc66-ac2b-3c83-32d3-000000000270]
15406 1726854951.64617: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000270
15406 1726854951.64732: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000270
15406 1726854951.64735: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": true,
        "lsr_net_profile_exists": true,
        "lsr_net_profile_fingerprint": true
    },
    "changed": false
}
15406 1726854951.64850: no more pending results, returning what we have
15406 1726854951.64852: results queue empty
15406 1726854951.64853: checking for any_errors_fatal
15406 1726854951.64862: done checking for any_errors_fatal
15406 1726854951.64863: checking for max_fail_percentage
15406 1726854951.64864: done checking for max_fail_percentage
15406 1726854951.64865: checking to see if all hosts have failed and the running result is not ok
15406 1726854951.64866:
done checking to see if all hosts have failed 15406 1726854951.64866: getting the remaining hosts for this loop 15406 1726854951.64868: done getting the remaining hosts for this loop 15406 1726854951.64871: getting the next task for host managed_node2 15406 1726854951.64878: done getting next task for host managed_node2 15406 1726854951.64880: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15406 1726854951.64883: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
15406 1726854951.64888: getting variables
15406 1726854951.64889: in VariableManager get_vars()
15406 1726854951.64915: Calling all_inventory to load vars for managed_node2
15406 1726854951.64918: Calling groups_inventory to load vars for managed_node2
15406 1726854951.64921: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854951.64929: Calling all_plugins_play to load vars for managed_node2
15406 1726854951.64932: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854951.64934: Calling groups_plugins_play to load vars for managed_node2
15406 1726854951.65971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854951.67163: done with get_vars()
15406 1726854951.67179: done getting variables
15406 1726854951.67223: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15406 1726854951.67322: variable 'profile' from source: play vars
15406 1726854951.67325: variable 'interface' from source: set_fact
15406 1726854951.67378: variable 'interface' from source: set_fact

TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] *******************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Friday 20 September 2024 13:55:51 -0400 (0:00:00.045) 0:00:19.496 ******
15406 1726854951.67421: entering _queue_task() for managed_node2/command
15406 1726854951.67696: worker is 1 (out of 1 available)
15406 1726854951.67709: exiting _queue_task() for managed_node2/command
15406 1726854951.67721: done queuing things up, now waiting for results queue to drain
15406 1726854951.67722: waiting for pending results...
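The entries around this point queue the "Get the ansible_managed comment" task from get_profile_stat.yml:49 and then skip it because `profile_stat.stat.exists` evaluated to False. A minimal sketch of what such a guarded command task typically looks like; the task body is not shown in this log, so the `grep` command, the ifcfg path, and the `ansible_managed_comment` register name are all assumptions for illustration:

```yaml
# Hypothetical reconstruction -- only the task name and the when: condition
# are taken from the log above; everything else is an assumed example.
- name: "Get the ansible_managed comment in ifcfg-{{ profile }}"
  command: grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ansible_managed_comment  # assumed name, not shown in the log
  when: profile_stat.stat.exists
```

When the `when:` clause is false, ansible emits exactly the `skipping:` result with `"false_condition": "profile_stat.stat.exists"` seen below.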
15406 1726854951.67918: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31
15406 1726854951.68011: in run() - task 0affcc66-ac2b-3c83-32d3-000000000272
15406 1726854951.68020: variable 'ansible_search_path' from source: unknown
15406 1726854951.68023: variable 'ansible_search_path' from source: unknown
15406 1726854951.68052: calling self._execute()
15406 1726854951.68120: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854951.68124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854951.68132: variable 'omit' from source: magic vars
15406 1726854951.68389: variable 'ansible_distribution_major_version' from source: facts
15406 1726854951.68400: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854951.68476: variable 'profile_stat' from source: set_fact
15406 1726854951.68491: Evaluated conditional (profile_stat.stat.exists): False
15406 1726854951.68497: when evaluation is False, skipping this task
15406 1726854951.68500: _execute() done
15406 1726854951.68502: dumping result to json
15406 1726854951.68505: done dumping result, returning
15406 1726854951.68507: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [0affcc66-ac2b-3c83-32d3-000000000272]
15406 1726854951.68512: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000272
15406 1726854951.68591: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000272
15406 1726854951.68596: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15406 1726854951.68665: no more pending results, returning what we have
15406 1726854951.68669: results queue empty
15406 1726854951.68670: checking for any_errors_fatal
15406 1726854951.68676: done checking for any_errors_fatal
15406
1726854951.68677: checking for max_fail_percentage 15406 1726854951.68678: done checking for max_fail_percentage 15406 1726854951.68679: checking to see if all hosts have failed and the running result is not ok 15406 1726854951.68681: done checking to see if all hosts have failed 15406 1726854951.68682: getting the remaining hosts for this loop 15406 1726854951.68683: done getting the remaining hosts for this loop 15406 1726854951.68686: getting the next task for host managed_node2 15406 1726854951.68696: done getting next task for host managed_node2 15406 1726854951.68698: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15406 1726854951.68701: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854951.68704: getting variables 15406 1726854951.68705: in VariableManager get_vars() 15406 1726854951.68727: Calling all_inventory to load vars for managed_node2 15406 1726854951.68729: Calling groups_inventory to load vars for managed_node2 15406 1726854951.68732: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854951.68741: Calling all_plugins_play to load vars for managed_node2 15406 1726854951.68743: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854951.68745: Calling groups_plugins_play to load vars for managed_node2 15406 1726854951.69584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854951.70888: done with get_vars() 15406 1726854951.70917: done getting variables 15406 1726854951.71000: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 15406 1726854951.71137: variable 'profile' from source: play vars 15406 1726854951.71144: variable 'interface' from source: set_fact 15406 1726854951.71226: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:55:51 -0400 (0:00:00.038) 0:00:19.535 ****** 15406 1726854951.71253: entering _queue_task() for managed_node2/set_fact 15406 1726854951.71517: worker is 1 (out of 1 available) 15406 1726854951.71528: exiting _queue_task() for managed_node2/set_fact 15406 1726854951.71540: done queuing things up, now waiting for results queue to drain 15406 1726854951.71542: waiting for pending results... 
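The next task queued here, "Verify the ansible_managed comment" from get_profile_stat.yml:56, is a set_fact action guarded by the same condition and is likewise skipped below. A hedged sketch of such a task; the fact name `lsr_net_profile_ansible_managed` appears in the earlier set_fact result in this log, but the expression and the `ansible_managed_comment` variable are assumptions:

```yaml
# Hypothetical reconstruction -- the task name, the when: condition, and the
# fact name come from this log; the right-hand expression is an assumed example.
- name: "Verify the ansible_managed comment in ifcfg-{{ profile }}"
  set_fact:
    lsr_net_profile_ansible_managed: "{{ ansible_managed_comment.rc == 0 }}"  # assumed
  when: profile_stat.stat.exists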
15406 1726854951.71818: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31
15406 1726854951.71883: in run() - task 0affcc66-ac2b-3c83-32d3-000000000273
15406 1726854951.71895: variable 'ansible_search_path' from source: unknown
15406 1726854951.71899: variable 'ansible_search_path' from source: unknown
15406 1726854951.71930: calling self._execute()
15406 1726854951.72000: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854951.72004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854951.72013: variable 'omit' from source: magic vars
15406 1726854951.72595: variable 'ansible_distribution_major_version' from source: facts
15406 1726854951.72598: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854951.72600: variable 'profile_stat' from source: set_fact
15406 1726854951.72603: Evaluated conditional (profile_stat.stat.exists): False
15406 1726854951.72605: when evaluation is False, skipping this task
15406 1726854951.72607: _execute() done
15406 1726854951.72609: dumping result to json
15406 1726854951.72611: done dumping result, returning
15406 1726854951.72613: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [0affcc66-ac2b-3c83-32d3-000000000273]
15406 1726854951.72615: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000273
15406 1726854951.72676: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000273
15406 1726854951.72681: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15406 1726854951.72727: no more pending results, returning what we have
15406 1726854951.72730: results queue empty
15406 1726854951.72733: checking for any_errors_fatal
15406 1726854951.72738: done checking for any_errors_fatal
15406
1726854951.72739: checking for max_fail_percentage 15406 1726854951.72741: done checking for max_fail_percentage 15406 1726854951.72742: checking to see if all hosts have failed and the running result is not ok 15406 1726854951.72742: done checking to see if all hosts have failed 15406 1726854951.72743: getting the remaining hosts for this loop 15406 1726854951.72744: done getting the remaining hosts for this loop 15406 1726854951.72748: getting the next task for host managed_node2 15406 1726854951.72755: done getting next task for host managed_node2 15406 1726854951.72757: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15406 1726854951.72760: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854951.72764: getting variables 15406 1726854951.72766: in VariableManager get_vars() 15406 1726854951.72965: Calling all_inventory to load vars for managed_node2 15406 1726854951.72968: Calling groups_inventory to load vars for managed_node2 15406 1726854951.72971: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854951.72981: Calling all_plugins_play to load vars for managed_node2 15406 1726854951.72984: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854951.72989: Calling groups_plugins_play to load vars for managed_node2 15406 1726854951.74262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854951.75141: done with get_vars() 15406 1726854951.75155: done getting variables 15406 1726854951.75200: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 15406 1726854951.75272: variable 'profile' from source: play vars 15406 1726854951.75275: variable 'interface' from source: set_fact 15406 1726854951.75317: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:55:51 -0400 (0:00:00.040) 0:00:19.576 ****** 15406 1726854951.75339: entering _queue_task() for managed_node2/command 15406 1726854951.75586: worker is 1 (out of 1 available) 15406 1726854951.75603: exiting _queue_task() for managed_node2/command 15406 1726854951.75614: done queuing things up, now waiting for results queue to drain 15406 1726854951.75615: waiting for pending results... 
15406 1726854951.75793: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31
15406 1726854951.75916: in run() - task 0affcc66-ac2b-3c83-32d3-000000000274
15406 1726854951.75936: variable 'ansible_search_path' from source: unknown
15406 1726854951.75943: variable 'ansible_search_path' from source: unknown
15406 1726854951.75980: calling self._execute()
15406 1726854951.76077: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854951.76090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854951.76105: variable 'omit' from source: magic vars
15406 1726854951.76677: variable 'ansible_distribution_major_version' from source: facts
15406 1726854951.76698: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854951.76825: variable 'profile_stat' from source: set_fact
15406 1726854951.76846: Evaluated conditional (profile_stat.stat.exists): False
15406 1726854951.76853: when evaluation is False, skipping this task
15406 1726854951.76860: _execute() done
15406 1726854951.76866: dumping result to json
15406 1726854951.76873: done dumping result, returning
15406 1726854951.76889: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [0affcc66-ac2b-3c83-32d3-000000000274]
15406 1726854951.76900: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000274
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15406 1726854951.77039: no more pending results, returning what we have
15406 1726854951.77042: results queue empty
15406 1726854951.77044: checking for any_errors_fatal
15406 1726854951.77051: done checking for any_errors_fatal
15406 1726854951.77052: checking for max_fail_percentage
15406 1726854951.77054: done checking for max_fail_percentage
15406 1726854951.77054: checking to see if all hosts
have failed and the running result is not ok 15406 1726854951.77055: done checking to see if all hosts have failed 15406 1726854951.77056: getting the remaining hosts for this loop 15406 1726854951.77057: done getting the remaining hosts for this loop 15406 1726854951.77060: getting the next task for host managed_node2 15406 1726854951.77069: done getting next task for host managed_node2 15406 1726854951.77071: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15406 1726854951.77074: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854951.77078: getting variables 15406 1726854951.77079: in VariableManager get_vars() 15406 1726854951.77111: Calling all_inventory to load vars for managed_node2 15406 1726854951.77113: Calling groups_inventory to load vars for managed_node2 15406 1726854951.77117: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854951.77130: Calling all_plugins_play to load vars for managed_node2 15406 1726854951.77132: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854951.77136: Calling groups_plugins_play to load vars for managed_node2 15406 1726854951.77656: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000274 15406 1726854951.77659: WORKER PROCESS EXITING 15406 1726854951.78683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854951.80268: done with get_vars() 15406 1726854951.80294: done getting variables 15406 1726854951.80359: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 15406 1726854951.80475: variable 'profile' from source: play vars 15406 1726854951.80479: variable 'interface' from source: set_fact 15406 1726854951.80540: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:55:51 -0400 (0:00:00.052) 0:00:19.628 ****** 15406 1726854951.80577: entering _queue_task() for managed_node2/set_fact 15406 1726854951.81004: worker is 1 (out of 1 available) 15406 1726854951.81016: exiting _queue_task() for managed_node2/set_fact 15406 
1726854951.81026: done queuing things up, now waiting for results queue to drain
15406 1726854951.81028: waiting for pending results...
15406 1726854951.81219: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31
15406 1726854951.81409: in run() - task 0affcc66-ac2b-3c83-32d3-000000000275
15406 1726854951.81415: variable 'ansible_search_path' from source: unknown
15406 1726854951.81418: variable 'ansible_search_path' from source: unknown
15406 1726854951.81432: calling self._execute()
15406 1726854951.81535: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854951.81581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854951.81584: variable 'omit' from source: magic vars
15406 1726854951.81951: variable 'ansible_distribution_major_version' from source: facts
15406 1726854951.81974: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854951.82106: variable 'profile_stat' from source: set_fact
15406 1726854951.82131: Evaluated conditional (profile_stat.stat.exists): False
15406 1726854951.82190: when evaluation is False, skipping this task
15406 1726854951.82194: _execute() done
15406 1726854951.82196: dumping result to json
15406 1726854951.82199: done dumping result, returning
15406 1726854951.82201: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [0affcc66-ac2b-3c83-32d3-000000000275]
15406 1726854951.82204: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000275
15406 1726854951.82267: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000275
15406 1726854951.82270: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15406 1726854951.82330: no more pending results, returning what we have
15406 1726854951.82336: results queue empty
15406 1726854951.82337: checking for any_errors_fatal 15406 1726854951.82343: done checking for any_errors_fatal 15406 1726854951.82344: checking for max_fail_percentage 15406 1726854951.82346: done checking for max_fail_percentage 15406 1726854951.82347: checking to see if all hosts have failed and the running result is not ok 15406 1726854951.82348: done checking to see if all hosts have failed 15406 1726854951.82348: getting the remaining hosts for this loop 15406 1726854951.82349: done getting the remaining hosts for this loop 15406 1726854951.82353: getting the next task for host managed_node2 15406 1726854951.82362: done getting next task for host managed_node2 15406 1726854951.82364: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 15406 1726854951.82367: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
15406 1726854951.82371: getting variables
15406 1726854951.82372: in VariableManager get_vars()
15406 1726854951.82401: Calling all_inventory to load vars for managed_node2
15406 1726854951.82404: Calling groups_inventory to load vars for managed_node2
15406 1726854951.82408: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854951.82421: Calling all_plugins_play to load vars for managed_node2
15406 1726854951.82424: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854951.82426: Calling groups_plugins_play to load vars for managed_node2
15406 1726854951.84009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854951.85649: done with get_vars()
15406 1726854951.85675: done getting variables
15406 1726854951.85747: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15406 1726854951.85874: variable 'profile' from source: play vars
15406 1726854951.85878: variable 'interface' from source: set_fact
15406 1726854951.85942: variable 'interface' from source: set_fact

TASK [Assert that the profile is present - 'LSR-TST-br31'] *********************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Friday 20 September 2024 13:55:51 -0400 (0:00:00.053) 0:00:19.682 ******
15406 1726854951.85978: entering _queue_task() for managed_node2/assert
15406 1726854951.86333: worker is 1 (out of 1 available)
15406 1726854951.86346: exiting _queue_task() for managed_node2/assert
15406 1726854951.86359: done queuing things up, now waiting for results queue to drain
15406 1726854951.86360: waiting for pending results...
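The assert task queued here (assert_profile_present.yml:5) passes: the log below reports "Evaluated conditional (lsr_net_profile_exists): True" and "All assertions passed". A minimal sketch of such an assert task, grounded only in the task name and the conditional the log reports; any further assertions or failure messages the real file may contain are not shown in this log:

```yaml
# Hypothetical reconstruction from the log's task name and evaluated conditional.
- name: "Assert that the profile is present - '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_exists
```

`lsr_net_profile_exists` is the fact set to true by the earlier set_fact task in this run, which is why the assertion evaluates to True here.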
15406 1726854951.86711: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'LSR-TST-br31' 15406 1726854951.86752: in run() - task 0affcc66-ac2b-3c83-32d3-000000000260 15406 1726854951.86772: variable 'ansible_search_path' from source: unknown 15406 1726854951.86792: variable 'ansible_search_path' from source: unknown 15406 1726854951.86830: calling self._execute() 15406 1726854951.86936: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854951.86992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854951.86995: variable 'omit' from source: magic vars 15406 1726854951.87348: variable 'ansible_distribution_major_version' from source: facts 15406 1726854951.87375: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854951.87386: variable 'omit' from source: magic vars 15406 1726854951.87428: variable 'omit' from source: magic vars 15406 1726854951.87534: variable 'profile' from source: play vars 15406 1726854951.87542: variable 'interface' from source: set_fact 15406 1726854951.87608: variable 'interface' from source: set_fact 15406 1726854951.87633: variable 'omit' from source: magic vars 15406 1726854951.87676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854951.87728: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854951.87794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854951.87798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854951.87805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854951.87835: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 15406 1726854951.87844: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854951.87852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854951.87953: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854951.87963: Set connection var ansible_timeout to 10 15406 1726854951.87991: Set connection var ansible_connection to ssh 15406 1726854951.87994: Set connection var ansible_shell_type to sh 15406 1726854951.87997: Set connection var ansible_shell_executable to /bin/sh 15406 1726854951.87999: Set connection var ansible_pipelining to False 15406 1726854951.88028: variable 'ansible_shell_executable' from source: unknown 15406 1726854951.88035: variable 'ansible_connection' from source: unknown 15406 1726854951.88119: variable 'ansible_module_compression' from source: unknown 15406 1726854951.88121: variable 'ansible_shell_type' from source: unknown 15406 1726854951.88123: variable 'ansible_shell_executable' from source: unknown 15406 1726854951.88126: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854951.88128: variable 'ansible_pipelining' from source: unknown 15406 1726854951.88130: variable 'ansible_timeout' from source: unknown 15406 1726854951.88132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854951.88208: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854951.88227: variable 'omit' from source: magic vars 15406 1726854951.88238: starting attempt loop 15406 1726854951.88249: running the handler 15406 1726854951.88364: variable 'lsr_net_profile_exists' from source: set_fact 15406 1726854951.88391: Evaluated conditional 
(lsr_net_profile_exists): True
15406 1726854951.88394: handler run complete
15406 1726854951.88400: attempt loop complete, returning result
15406 1726854951.88406: _execute() done
15406 1726854951.88412: dumping result to json
15406 1726854951.88445: done dumping result, returning
15406 1726854951.88447: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'LSR-TST-br31' [0affcc66-ac2b-3c83-32d3-000000000260]
15406 1726854951.88450: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000260
ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed
15406 1726854951.88684: no more pending results, returning what we have
15406 1726854951.88689: results queue empty
15406 1726854951.88690: checking for any_errors_fatal
15406 1726854951.88699: done checking for any_errors_fatal
15406 1726854951.88699: checking for max_fail_percentage
15406 1726854951.88702: done checking for max_fail_percentage
15406 1726854951.88702: checking to see if all hosts have failed and the running result is not ok
15406 1726854951.88707: done checking to see if all hosts have failed
15406 1726854951.88708: getting the remaining hosts for this loop
15406 1726854951.88709: done getting the remaining hosts for this loop
15406 1726854951.88713: getting the next task for host managed_node2
15406 1726854951.88719: done getting next task for host managed_node2
15406 1726854951.88722: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}'
15406 1726854951.88725: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854951.88729: getting variables 15406 1726854951.88731: in VariableManager get_vars() 15406 1726854951.88759: Calling all_inventory to load vars for managed_node2 15406 1726854951.88762: Calling groups_inventory to load vars for managed_node2 15406 1726854951.88766: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854951.88778: Calling all_plugins_play to load vars for managed_node2 15406 1726854951.88781: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854951.88784: Calling groups_plugins_play to load vars for managed_node2 15406 1726854951.89402: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000260 15406 1726854951.89406: WORKER PROCESS EXITING 15406 1726854951.90478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854951.92076: done with get_vars() 15406 1726854951.92099: done getting variables 15406 1726854951.92163: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 15406 1726854951.92275: variable 'profile' from source: play vars 15406 1726854951.92278: variable 'interface' from source: set_fact 15406 1726854951.92343: variable 'interface' from source: set_fact TASK [Assert that the ansible managed comment is present in 'LSR-TST-br31'] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:55:51 -0400 (0:00:00.063) 0:00:19.746 ****** 15406 1726854951.92380: entering _queue_task() for managed_node2/assert 15406 
1726854951.92912: worker is 1 (out of 1 available) 15406 1726854951.92920: exiting _queue_task() for managed_node2/assert 15406 1726854951.92930: done queuing things up, now waiting for results queue to drain 15406 1726854951.92931: waiting for pending results... 15406 1726854951.92972: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' 15406 1726854951.93086: in run() - task 0affcc66-ac2b-3c83-32d3-000000000261 15406 1726854951.93107: variable 'ansible_search_path' from source: unknown 15406 1726854951.93113: variable 'ansible_search_path' from source: unknown 15406 1726854951.93157: calling self._execute() 15406 1726854951.93246: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854951.93256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854951.93279: variable 'omit' from source: magic vars 15406 1726854951.93637: variable 'ansible_distribution_major_version' from source: facts 15406 1726854951.93653: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854951.93662: variable 'omit' from source: magic vars 15406 1726854951.93715: variable 'omit' from source: magic vars 15406 1726854951.93825: variable 'profile' from source: play vars 15406 1726854951.93835: variable 'interface' from source: set_fact 15406 1726854951.93901: variable 'interface' from source: set_fact 15406 1726854951.93933: variable 'omit' from source: magic vars 15406 1726854951.93978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854951.94034: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854951.94060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854951.94086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 15406 1726854951.94107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854951.94147: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854951.94192: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854951.94196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854951.94269: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854951.94281: Set connection var ansible_timeout to 10 15406 1726854951.94289: Set connection var ansible_connection to ssh 15406 1726854951.94299: Set connection var ansible_shell_type to sh 15406 1726854951.94307: Set connection var ansible_shell_executable to /bin/sh 15406 1726854951.94318: Set connection var ansible_pipelining to False 15406 1726854951.94358: variable 'ansible_shell_executable' from source: unknown 15406 1726854951.94361: variable 'ansible_connection' from source: unknown 15406 1726854951.94363: variable 'ansible_module_compression' from source: unknown 15406 1726854951.94365: variable 'ansible_shell_type' from source: unknown 15406 1726854951.94467: variable 'ansible_shell_executable' from source: unknown 15406 1726854951.94470: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854951.94472: variable 'ansible_pipelining' from source: unknown 15406 1726854951.94475: variable 'ansible_timeout' from source: unknown 15406 1726854951.94477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854951.94532: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 
1726854951.94547: variable 'omit' from source: magic vars 15406 1726854951.94557: starting attempt loop 15406 1726854951.94563: running the handler 15406 1726854951.94674: variable 'lsr_net_profile_ansible_managed' from source: set_fact 15406 1726854951.94693: Evaluated conditional (lsr_net_profile_ansible_managed): True 15406 1726854951.94702: handler run complete 15406 1726854951.94719: attempt loop complete, returning result 15406 1726854951.94725: _execute() done 15406 1726854951.94731: dumping result to json 15406 1726854951.94738: done dumping result, returning 15406 1726854951.94747: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' [0affcc66-ac2b-3c83-32d3-000000000261] 15406 1726854951.94755: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000261 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 15406 1726854951.94943: no more pending results, returning what we have 15406 1726854951.94947: results queue empty 15406 1726854951.94948: checking for any_errors_fatal 15406 1726854951.94954: done checking for any_errors_fatal 15406 1726854951.94955: checking for max_fail_percentage 15406 1726854951.94957: done checking for max_fail_percentage 15406 1726854951.94958: checking to see if all hosts have failed and the running result is not ok 15406 1726854951.94959: done checking to see if all hosts have failed 15406 1726854951.94960: getting the remaining hosts for this loop 15406 1726854951.94962: done getting the remaining hosts for this loop 15406 1726854951.94965: getting the next task for host managed_node2 15406 1726854951.94971: done getting next task for host managed_node2 15406 1726854951.94974: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 15406 1726854951.94977: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854951.94981: getting variables 15406 1726854951.94983: in VariableManager get_vars() 15406 1726854951.95099: Calling all_inventory to load vars for managed_node2 15406 1726854951.95102: Calling groups_inventory to load vars for managed_node2 15406 1726854951.95106: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854951.95120: Calling all_plugins_play to load vars for managed_node2 15406 1726854951.95123: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854951.95126: Calling groups_plugins_play to load vars for managed_node2 15406 1726854951.95700: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000261 15406 1726854951.95704: WORKER PROCESS EXITING 15406 1726854951.96566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854951.98262: done with get_vars() 15406 1726854951.98286: done getting variables 15406 1726854951.98337: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 15406 1726854951.98438: variable 'profile' from source: play vars 15406 1726854951.98441: variable 'interface' from source: set_fact 15406 1726854951.98498: variable 'interface' from source: set_fact TASK [Assert that the fingerprint comment is present in LSR-TST-br31] ********** task 
path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:55:51 -0400 (0:00:00.061) 0:00:19.808 ****** 15406 1726854951.98529: entering _queue_task() for managed_node2/assert 15406 1726854951.98786: worker is 1 (out of 1 available) 15406 1726854951.98802: exiting _queue_task() for managed_node2/assert 15406 1726854951.98900: done queuing things up, now waiting for results queue to drain 15406 1726854951.98902: waiting for pending results... 15406 1726854951.99093: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 15406 1726854951.99213: in run() - task 0affcc66-ac2b-3c83-32d3-000000000262 15406 1726854951.99234: variable 'ansible_search_path' from source: unknown 15406 1726854951.99249: variable 'ansible_search_path' from source: unknown 15406 1726854951.99290: calling self._execute() 15406 1726854951.99385: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854951.99402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854951.99418: variable 'omit' from source: magic vars 15406 1726854951.99993: variable 'ansible_distribution_major_version' from source: facts 15406 1726854951.99996: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854951.99998: variable 'omit' from source: magic vars 15406 1726854952.00000: variable 'omit' from source: magic vars 15406 1726854952.00002: variable 'profile' from source: play vars 15406 1726854952.00005: variable 'interface' from source: set_fact 15406 1726854952.00034: variable 'interface' from source: set_fact 15406 1726854952.00057: variable 'omit' from source: magic vars 15406 1726854952.00100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854952.00142: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854952.00162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854952.00179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854952.00196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854952.00234: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854952.00242: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854952.00248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854952.00349: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854952.00360: Set connection var ansible_timeout to 10 15406 1726854952.00366: Set connection var ansible_connection to ssh 15406 1726854952.00375: Set connection var ansible_shell_type to sh 15406 1726854952.00383: Set connection var ansible_shell_executable to /bin/sh 15406 1726854952.00397: Set connection var ansible_pipelining to False 15406 1726854952.00423: variable 'ansible_shell_executable' from source: unknown 15406 1726854952.00430: variable 'ansible_connection' from source: unknown 15406 1726854952.00436: variable 'ansible_module_compression' from source: unknown 15406 1726854952.00449: variable 'ansible_shell_type' from source: unknown 15406 1726854952.00455: variable 'ansible_shell_executable' from source: unknown 15406 1726854952.00461: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854952.00468: variable 'ansible_pipelining' from source: unknown 15406 1726854952.00474: variable 'ansible_timeout' from source: unknown 15406 1726854952.00481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 
1726854952.00622: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854952.00665: variable 'omit' from source: magic vars 15406 1726854952.00668: starting attempt loop 15406 1726854952.00670: running the handler 15406 1726854952.00761: variable 'lsr_net_profile_fingerprint' from source: set_fact 15406 1726854952.00777: Evaluated conditional (lsr_net_profile_fingerprint): True 15406 1726854952.00786: handler run complete 15406 1726854952.00883: attempt loop complete, returning result 15406 1726854952.00886: _execute() done 15406 1726854952.00891: dumping result to json 15406 1726854952.00893: done dumping result, returning 15406 1726854952.00895: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 [0affcc66-ac2b-3c83-32d3-000000000262] 15406 1726854952.00897: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000262 15406 1726854952.00958: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000262 15406 1726854952.00961: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 15406 1726854952.01032: no more pending results, returning what we have 15406 1726854952.01035: results queue empty 15406 1726854952.01036: checking for any_errors_fatal 15406 1726854952.01045: done checking for any_errors_fatal 15406 1726854952.01046: checking for max_fail_percentage 15406 1726854952.01048: done checking for max_fail_percentage 15406 1726854952.01050: checking to see if all hosts have failed and the running result is not ok 15406 1726854952.01051: done checking to see if all hosts have failed 15406 1726854952.01052: getting the remaining hosts for this loop 15406 1726854952.01053: done getting the 
remaining hosts for this loop 15406 1726854952.01056: getting the next task for host managed_node2 15406 1726854952.01064: done getting next task for host managed_node2 15406 1726854952.01067: ^ task is: TASK: meta (flush_handlers) 15406 1726854952.01068: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854952.01072: getting variables 15406 1726854952.01074: in VariableManager get_vars() 15406 1726854952.01105: Calling all_inventory to load vars for managed_node2 15406 1726854952.01107: Calling groups_inventory to load vars for managed_node2 15406 1726854952.01112: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854952.01122: Calling all_plugins_play to load vars for managed_node2 15406 1726854952.01125: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854952.01128: Calling groups_plugins_play to load vars for managed_node2 15406 1726854952.02628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854952.04229: done with get_vars() 15406 1726854952.04252: done getting variables 15406 1726854952.04318: in VariableManager get_vars() 15406 1726854952.04327: Calling all_inventory to load vars for managed_node2 15406 1726854952.04334: Calling groups_inventory to load vars for managed_node2 15406 1726854952.04337: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854952.04341: Calling all_plugins_play to load vars for managed_node2 15406 1726854952.04343: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854952.04346: Calling groups_plugins_play to load vars for managed_node2 15406 1726854952.05621: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854952.07211: done with get_vars() 15406 1726854952.07244: done queuing things up, now waiting for results queue to drain 15406 1726854952.07247: results queue empty 15406 1726854952.07247: checking for any_errors_fatal 15406 1726854952.07250: done checking for any_errors_fatal 15406 1726854952.07251: checking for max_fail_percentage 15406 1726854952.07252: done checking for max_fail_percentage 15406 1726854952.07263: checking to see if all hosts have failed and the running result is not ok 15406 1726854952.07265: done checking to see if all hosts have failed 15406 1726854952.07265: getting the remaining hosts for this loop 15406 1726854952.07267: done getting the remaining hosts for this loop 15406 1726854952.07270: getting the next task for host managed_node2 15406 1726854952.07274: done getting next task for host managed_node2 15406 1726854952.07276: ^ task is: TASK: meta (flush_handlers) 15406 1726854952.07277: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854952.07280: getting variables 15406 1726854952.07281: in VariableManager get_vars() 15406 1726854952.07292: Calling all_inventory to load vars for managed_node2 15406 1726854952.07294: Calling groups_inventory to load vars for managed_node2 15406 1726854952.07297: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854952.07302: Calling all_plugins_play to load vars for managed_node2 15406 1726854952.07304: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854952.07307: Calling groups_plugins_play to load vars for managed_node2 15406 1726854952.08465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854952.10034: done with get_vars() 15406 1726854952.10054: done getting variables 15406 1726854952.10106: in VariableManager get_vars() 15406 1726854952.10116: Calling all_inventory to load vars for managed_node2 15406 1726854952.10118: Calling groups_inventory to load vars for managed_node2 15406 1726854952.10120: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854952.10125: Calling all_plugins_play to load vars for managed_node2 15406 1726854952.10127: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854952.10130: Calling groups_plugins_play to load vars for managed_node2 15406 1726854952.11353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854952.12919: done with get_vars() 15406 1726854952.12944: done queuing things up, now waiting for results queue to drain 15406 1726854952.12946: results queue empty 15406 1726854952.12947: checking for any_errors_fatal 15406 1726854952.12948: done checking for any_errors_fatal 15406 1726854952.12949: checking for max_fail_percentage 15406 1726854952.12950: done checking for max_fail_percentage 15406 1726854952.12951: checking to see if all hosts have failed and the running result is not 
ok 15406 1726854952.12952: done checking to see if all hosts have failed 15406 1726854952.12952: getting the remaining hosts for this loop 15406 1726854952.12953: done getting the remaining hosts for this loop 15406 1726854952.12956: getting the next task for host managed_node2 15406 1726854952.12959: done getting next task for host managed_node2 15406 1726854952.12960: ^ task is: None 15406 1726854952.12962: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854952.12963: done queuing things up, now waiting for results queue to drain 15406 1726854952.12964: results queue empty 15406 1726854952.12964: checking for any_errors_fatal 15406 1726854952.12965: done checking for any_errors_fatal 15406 1726854952.12966: checking for max_fail_percentage 15406 1726854952.12967: done checking for max_fail_percentage 15406 1726854952.12968: checking to see if all hosts have failed and the running result is not ok 15406 1726854952.12968: done checking to see if all hosts have failed 15406 1726854952.12970: getting the next task for host managed_node2 15406 1726854952.12972: done getting next task for host managed_node2 15406 1726854952.12973: ^ task is: None 15406 1726854952.12974: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854952.13025: in VariableManager get_vars() 15406 1726854952.13049: done with get_vars() 15406 1726854952.13055: in VariableManager get_vars() 15406 1726854952.13068: done with get_vars() 15406 1726854952.13072: variable 'omit' from source: magic vars 15406 1726854952.13184: variable 'profile' from source: play vars 15406 1726854952.13311: in VariableManager get_vars() 15406 1726854952.13324: done with get_vars() 15406 1726854952.13347: variable 'omit' from source: magic vars 15406 1726854952.13409: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 15406 1726854952.14241: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15406 1726854952.14262: getting the remaining hosts for this loop 15406 1726854952.14264: done getting the remaining hosts for this loop 15406 1726854952.14266: getting the next task for host managed_node2 15406 1726854952.14269: done getting next task for host managed_node2 15406 1726854952.14271: ^ task is: TASK: Gathering Facts 15406 1726854952.14272: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854952.14274: getting variables 15406 1726854952.14275: in VariableManager get_vars() 15406 1726854952.14289: Calling all_inventory to load vars for managed_node2 15406 1726854952.14291: Calling groups_inventory to load vars for managed_node2 15406 1726854952.14293: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854952.14299: Calling all_plugins_play to load vars for managed_node2 15406 1726854952.14301: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854952.14304: Calling groups_plugins_play to load vars for managed_node2 15406 1726854952.15521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854952.17339: done with get_vars() 15406 1726854952.17365: done getting variables 15406 1726854952.17417: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 13:55:52 -0400 (0:00:00.189) 0:00:19.997 ****** 15406 1726854952.17442: entering _queue_task() for managed_node2/gather_facts 15406 1726854952.17834: worker is 1 (out of 1 available) 15406 1726854952.17845: exiting _queue_task() for managed_node2/gather_facts 15406 1726854952.17856: done queuing things up, now waiting for results queue to drain 15406 1726854952.17857: waiting for pending results... 
15406 1726854952.18085: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15406 1726854952.18240: in run() - task 0affcc66-ac2b-3c83-32d3-0000000002b5 15406 1726854952.18244: variable 'ansible_search_path' from source: unknown 15406 1726854952.18254: calling self._execute() 15406 1726854952.18359: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854952.18372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854952.18388: variable 'omit' from source: magic vars 15406 1726854952.18788: variable 'ansible_distribution_major_version' from source: facts 15406 1726854952.18897: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854952.18901: variable 'omit' from source: magic vars 15406 1726854952.18903: variable 'omit' from source: magic vars 15406 1726854952.18905: variable 'omit' from source: magic vars 15406 1726854952.18928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854952.18972: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854952.19004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854952.19030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854952.19048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854952.19079: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854952.19118: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854952.19125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854952.19257: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 
1726854952.19269: Set connection var ansible_timeout to 10 15406 1726854952.19276: Set connection var ansible_connection to ssh 15406 1726854952.19285: Set connection var ansible_shell_type to sh 15406 1726854952.19297: Set connection var ansible_shell_executable to /bin/sh 15406 1726854952.19310: Set connection var ansible_pipelining to False 15406 1726854952.19347: variable 'ansible_shell_executable' from source: unknown 15406 1726854952.19441: variable 'ansible_connection' from source: unknown 15406 1726854952.19445: variable 'ansible_module_compression' from source: unknown 15406 1726854952.19447: variable 'ansible_shell_type' from source: unknown 15406 1726854952.19450: variable 'ansible_shell_executable' from source: unknown 15406 1726854952.19452: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854952.19454: variable 'ansible_pipelining' from source: unknown 15406 1726854952.19457: variable 'ansible_timeout' from source: unknown 15406 1726854952.19459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854952.19773: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854952.19778: variable 'omit' from source: magic vars 15406 1726854952.19780: starting attempt loop 15406 1726854952.19783: running the handler 15406 1726854952.19785: variable 'ansible_facts' from source: unknown 15406 1726854952.19802: _low_level_execute_command(): starting 15406 1726854952.19814: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854952.20608: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854952.20683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854952.20878: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854952.20978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854952.22810: stdout chunk (state=3): >>>/root <<< 15406 1726854952.22934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854952.22948: stderr chunk (state=3): >>><<< 15406 1726854952.22959: stdout chunk (state=3): >>><<< 15406 1726854952.22984: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854952.23008: _low_level_execute_command(): starting 15406 1726854952.23018: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854952.2299523-16354-130958837560841 `" && echo ansible-tmp-1726854952.2299523-16354-130958837560841="` echo /root/.ansible/tmp/ansible-tmp-1726854952.2299523-16354-130958837560841 `" ) && sleep 0' 15406 1726854952.23666: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854952.23682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854952.23702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854952.23723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854952.23744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854952.23758: stderr chunk (state=3): >>>debug2: match not found <<< 15406 1726854952.23805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854952.23868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854952.23917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854952.24005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854952.25971: stdout chunk (state=3): >>>ansible-tmp-1726854952.2299523-16354-130958837560841=/root/.ansible/tmp/ansible-tmp-1726854952.2299523-16354-130958837560841 <<< 15406 1726854952.26108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854952.26139: stderr chunk (state=3): >>><<< 15406 1726854952.26155: stdout chunk (state=3): >>><<< 15406 1726854952.26395: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854952.2299523-16354-130958837560841=/root/.ansible/tmp/ansible-tmp-1726854952.2299523-16354-130958837560841 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854952.26399: variable 'ansible_module_compression' from source: unknown 15406 1726854952.26403: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15406 1726854952.26405: variable 'ansible_facts' from source: unknown 15406 1726854952.26561: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854952.2299523-16354-130958837560841/AnsiballZ_setup.py 15406 1726854952.26757: Sending initial data 15406 1726854952.26766: Sent initial data (154 bytes) 15406 1726854952.27376: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854952.27401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854952.27419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854952.27438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854952.27513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854952.27556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854952.27581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854952.27614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854952.27731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854952.29299: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854952.29396: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854952.29461: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpr_y5d_mq /root/.ansible/tmp/ansible-tmp-1726854952.2299523-16354-130958837560841/AnsiballZ_setup.py <<< 15406 1726854952.29485: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854952.2299523-16354-130958837560841/AnsiballZ_setup.py" <<< 15406 1726854952.29559: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpr_y5d_mq" to remote "/root/.ansible/tmp/ansible-tmp-1726854952.2299523-16354-130958837560841/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854952.2299523-16354-130958837560841/AnsiballZ_setup.py" <<< 15406 1726854952.31214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854952.31352: stderr chunk (state=3): >>><<< 15406 1726854952.31356: stdout chunk (state=3): >>><<< 15406 1726854952.31358: done transferring module to remote 15406 1726854952.31360: _low_level_execute_command(): starting 15406 1726854952.31362: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854952.2299523-16354-130958837560841/ /root/.ansible/tmp/ansible-tmp-1726854952.2299523-16354-130958837560841/AnsiballZ_setup.py && sleep 0' 15406 1726854952.31921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854952.31940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854952.32049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854952.33867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854952.33919: stderr chunk (state=3): >>><<< 15406 1726854952.33939: stdout chunk (state=3): >>><<< 15406 1726854952.33963: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854952.33972: _low_level_execute_command(): starting 15406 1726854952.33983: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854952.2299523-16354-130958837560841/AnsiballZ_setup.py && sleep 0' 15406 1726854952.34645: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854952.34660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854952.34674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854952.34760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854952.34763: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854952.34808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854952.34826: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854952.34855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854952.34974: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854952.97667: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_loadavg": {"1m": 0.38916015625, "5m": 0.3544921875, "15m": 0.1787109375}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", 
"ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "supp<<< 15406 1726854952.97726: stdout chunk (state=3): >>>ort_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 735, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797191680, "block_size": 4096, "block_total": 65519099, "block_available": 63915330, "block_used": 1603769, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_distribution": "CentOS", 
"ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", 
"tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": 
true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "46:8a:ac:06:3d:16", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", 
"tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "55", "second": "52", "epoch": "1726854952", 
"epoch_int": "1726854952", "date": "2024-09-20", "time": "13:55:52", "iso8601_micro": "2024-09-20T17:55:52.972045Z", "iso8601": "2024-09-20T17:55:52Z", "iso8601_basic": "20240920T135552972045", "iso8601_basic_short": "20240920T135552", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15406 1726854952.99893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 15406 1726854952.99897: stdout chunk (state=3): >>><<< 15406 1726854952.99900: stderr chunk (state=3): >>><<< 15406 1726854952.99905: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_loadavg": {"1m": 0.38916015625, "5m": 0.3544921875, "15m": 0.1787109375}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", 
"ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 735, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797191680, "block_size": 4096, "block_total": 65519099, "block_available": 63915330, "block_used": 1603769, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", 
"ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", 
"type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", 
"l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "46:8a:ac:06:3d:16", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", 
"tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "55", "second": "52", "epoch": "1726854952", "epoch_int": "1726854952", 
"date": "2024-09-20", "time": "13:55:52", "iso8601_micro": "2024-09-20T17:55:52.972045Z", "iso8601": "2024-09-20T17:55:52Z", "iso8601_basic": "20240920T135552972045", "iso8601_basic_short": "20240920T135552", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854953.00423: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854952.2299523-16354-130958837560841/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854953.00461: _low_level_execute_command(): starting 15406 1726854953.00537: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854952.2299523-16354-130958837560841/ > /dev/null 2>&1 && sleep 0' 15406 1726854953.01071: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854953.01085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854953.01107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854953.01211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854953.01244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854953.01247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854953.01337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854953.03217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854953.03227: stdout chunk (state=3): >>><<< 15406 1726854953.03238: stderr chunk (state=3): >>><<< 15406 1726854953.03258: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854953.03270: handler run complete 15406 1726854953.03426: variable 'ansible_facts' from source: unknown 15406 1726854953.03595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854953.03902: variable 'ansible_facts' from source: unknown 15406 1726854953.04003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854953.04155: attempt loop complete, returning result 15406 1726854953.04165: _execute() done 15406 1726854953.04171: dumping result to json 15406 1726854953.04216: done dumping result, returning 15406 1726854953.04228: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-3c83-32d3-0000000002b5] 15406 1726854953.04237: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000002b5 15406 1726854953.05049: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000002b5 15406 1726854953.05052: WORKER PROCESS EXITING ok: [managed_node2] 15406 1726854953.05428: no more pending results, returning what we have 15406 1726854953.05432: results queue empty 15406 1726854953.05433: checking for any_errors_fatal 15406 1726854953.05434: done checking for any_errors_fatal 15406 1726854953.05435: checking for max_fail_percentage 15406 1726854953.05436: done checking for max_fail_percentage 15406 1726854953.05437: checking to see if all hosts have failed and the running result is not ok 15406 1726854953.05438: done checking to see if all hosts have failed 15406 1726854953.05439: getting the remaining hosts for this loop 15406 1726854953.05440: done 
getting the remaining hosts for this loop 15406 1726854953.05443: getting the next task for host managed_node2 15406 1726854953.05448: done getting next task for host managed_node2 15406 1726854953.05449: ^ task is: TASK: meta (flush_handlers) 15406 1726854953.05451: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854953.05455: getting variables 15406 1726854953.05456: in VariableManager get_vars() 15406 1726854953.05484: Calling all_inventory to load vars for managed_node2 15406 1726854953.05489: Calling groups_inventory to load vars for managed_node2 15406 1726854953.05491: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854953.05503: Calling all_plugins_play to load vars for managed_node2 15406 1726854953.05505: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854953.05507: Calling groups_plugins_play to load vars for managed_node2 15406 1726854953.06769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854953.08395: done with get_vars() 15406 1726854953.08421: done getting variables 15406 1726854953.08492: in VariableManager get_vars() 15406 1726854953.08508: Calling all_inventory to load vars for managed_node2 15406 1726854953.08510: Calling groups_inventory to load vars for managed_node2 15406 1726854953.08512: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854953.08518: Calling all_plugins_play to load vars for managed_node2 15406 1726854953.08520: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854953.08523: Calling groups_plugins_play to load vars for managed_node2 15406 1726854953.13370: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854953.14708: done with get_vars() 15406 1726854953.14727: done queuing things up, now waiting for results queue to drain 15406 1726854953.14729: results queue empty 15406 1726854953.14729: checking for any_errors_fatal 15406 1726854953.14731: done checking for any_errors_fatal 15406 1726854953.14732: checking for max_fail_percentage 15406 1726854953.14732: done checking for max_fail_percentage 15406 1726854953.14736: checking to see if all hosts have failed and the running result is not ok 15406 1726854953.14737: done checking to see if all hosts have failed 15406 1726854953.14737: getting the remaining hosts for this loop 15406 1726854953.14738: done getting the remaining hosts for this loop 15406 1726854953.14740: getting the next task for host managed_node2 15406 1726854953.14743: done getting next task for host managed_node2 15406 1726854953.14744: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15406 1726854953.14745: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854953.14752: getting variables 15406 1726854953.14753: in VariableManager get_vars() 15406 1726854953.14762: Calling all_inventory to load vars for managed_node2 15406 1726854953.14763: Calling groups_inventory to load vars for managed_node2 15406 1726854953.14764: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854953.14768: Calling all_plugins_play to load vars for managed_node2 15406 1726854953.14770: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854953.14772: Calling groups_plugins_play to load vars for managed_node2 15406 1726854953.15392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854953.16235: done with get_vars() 15406 1726854953.16249: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:55:53 -0400 (0:00:00.988) 0:00:20.985 ****** 15406 1726854953.16304: entering _queue_task() for managed_node2/include_tasks 15406 1726854953.16649: worker is 1 (out of 1 available) 15406 1726854953.16663: exiting _queue_task() for managed_node2/include_tasks 15406 1726854953.16675: done queuing things up, now waiting for results queue to drain 15406 1726854953.16676: waiting for pending results... 
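The task banner just above carries two durations in the callback's timing line, e.g. `(0:00:00.988) 0:00:20.985`: the elapsed time of the task that just finished and the playbook's running total. As a minimal illustrative sketch (this is an editor-added example, not part of the captured log, and not Ansible's own callback code), those fields can be pulled apart like so:

```python
import re

def parse_timing(banner: str) -> tuple[float, float]:
    """Return (task_seconds, total_seconds) from a timing banner such as
    '... -0400 (0:00:00.988) 0:00:20.985 ******' as seen in this trace."""
    m = re.search(r"\((\d+):(\d+):([\d.]+)\)\s+(\d+):(\d+):([\d.]+)", banner)
    if not m:
        raise ValueError("no timing fields found in banner")
    h1, m1, s1, h2, m2, s2 = m.groups()
    task = int(h1) * 3600 + int(m1) * 60 + float(s1)
    total = int(h2) * 3600 + int(m2) * 60 + float(s2)
    return task, total

# Banner text taken verbatim from the log above:
print(parse_timing("Friday 20 September 2024 13:55:53 -0400 (0:00:00.988) 0:00:20.985 ******"))
# → (0.988, 20.985)
```

Here the 0.988 s task time matches the `Gathering Facts` run whose payload opens this section, and 20.985 s is the cumulative playbook time.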
15406 1726854953.16901: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15406 1726854953.17013: in run() - task 0affcc66-ac2b-3c83-32d3-00000000003a 15406 1726854953.17041: variable 'ansible_search_path' from source: unknown 15406 1726854953.17108: variable 'ansible_search_path' from source: unknown 15406 1726854953.17110: calling self._execute() 15406 1726854953.17214: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854953.17227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854953.17247: variable 'omit' from source: magic vars 15406 1726854953.17647: variable 'ansible_distribution_major_version' from source: facts 15406 1726854953.17663: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854953.17673: _execute() done 15406 1726854953.17685: dumping result to json 15406 1726854953.17699: done dumping result, returning 15406 1726854953.17795: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-3c83-32d3-00000000003a] 15406 1726854953.17799: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000003a 15406 1726854953.17871: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000003a 15406 1726854953.17874: WORKER PROCESS EXITING 15406 1726854953.17937: no more pending results, returning what we have 15406 1726854953.17943: in VariableManager get_vars() 15406 1726854953.17984: Calling all_inventory to load vars for managed_node2 15406 1726854953.17989: Calling groups_inventory to load vars for managed_node2 15406 1726854953.17995: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854953.18009: Calling all_plugins_play to load vars for managed_node2 15406 1726854953.18013: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854953.18017: Calling 
groups_plugins_play to load vars for managed_node2 15406 1726854953.18975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854953.19859: done with get_vars() 15406 1726854953.19877: variable 'ansible_search_path' from source: unknown 15406 1726854953.19879: variable 'ansible_search_path' from source: unknown 15406 1726854953.19909: we have included files to process 15406 1726854953.19910: generating all_blocks data 15406 1726854953.19911: done generating all_blocks data 15406 1726854953.19911: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15406 1726854953.19912: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15406 1726854953.19914: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15406 1726854953.20480: done processing included file 15406 1726854953.20482: iterating over new_blocks loaded from include file 15406 1726854953.20483: in VariableManager get_vars() 15406 1726854953.20554: done with get_vars() 15406 1726854953.20556: filtering new block on tags 15406 1726854953.20571: done filtering new block on tags 15406 1726854953.20574: in VariableManager get_vars() 15406 1726854953.20596: done with get_vars() 15406 1726854953.20598: filtering new block on tags 15406 1726854953.20622: done filtering new block on tags 15406 1726854953.20625: in VariableManager get_vars() 15406 1726854953.20645: done with get_vars() 15406 1726854953.20647: filtering new block on tags 15406 1726854953.20663: done filtering new block on tags 15406 1726854953.20665: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 15406 1726854953.20669: extending task lists for 
all hosts with included blocks 15406 1726854953.21051: done extending task lists 15406 1726854953.21053: done processing included files 15406 1726854953.21057: results queue empty 15406 1726854953.21058: checking for any_errors_fatal 15406 1726854953.21060: done checking for any_errors_fatal 15406 1726854953.21061: checking for max_fail_percentage 15406 1726854953.21062: done checking for max_fail_percentage 15406 1726854953.21062: checking to see if all hosts have failed and the running result is not ok 15406 1726854953.21063: done checking to see if all hosts have failed 15406 1726854953.21064: getting the remaining hosts for this loop 15406 1726854953.21065: done getting the remaining hosts for this loop 15406 1726854953.21067: getting the next task for host managed_node2 15406 1726854953.21070: done getting next task for host managed_node2 15406 1726854953.21073: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15406 1726854953.21075: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854953.21084: getting variables 15406 1726854953.21085: in VariableManager get_vars() 15406 1726854953.21101: Calling all_inventory to load vars for managed_node2 15406 1726854953.21103: Calling groups_inventory to load vars for managed_node2 15406 1726854953.21104: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854953.21108: Calling all_plugins_play to load vars for managed_node2 15406 1726854953.21109: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854953.21111: Calling groups_plugins_play to load vars for managed_node2 15406 1726854953.22708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854953.24403: done with get_vars() 15406 1726854953.24432: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:55:53 -0400 (0:00:00.083) 0:00:21.068 ****** 15406 1726854953.24617: entering _queue_task() for managed_node2/setup 15406 1726854953.25130: worker is 1 (out of 1 available) 15406 1726854953.25142: exiting _queue_task() for managed_node2/setup 15406 1726854953.25153: done queuing things up, now waiting for results queue to drain 15406 1726854953.25155: waiting for pending results... 
15406 1726854953.25968: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15406 1726854953.26115: in run() - task 0affcc66-ac2b-3c83-32d3-0000000002f6 15406 1726854953.26138: variable 'ansible_search_path' from source: unknown 15406 1726854953.26175: variable 'ansible_search_path' from source: unknown 15406 1726854953.26203: calling self._execute() 15406 1726854953.26321: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854953.26394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854953.26398: variable 'omit' from source: magic vars 15406 1726854953.26774: variable 'ansible_distribution_major_version' from source: facts 15406 1726854953.26794: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854953.27023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854953.31176: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854953.31417: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854953.31488: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854953.31540: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854953.31628: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854953.31876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854953.31881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854953.32005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854953.32070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854953.32391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854953.32399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854953.32406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854953.32410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854953.32412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854953.32458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854953.32802: variable '__network_required_facts' from source: role 
'' defaults 15406 1726854953.32819: variable 'ansible_facts' from source: unknown 15406 1726854953.33985: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15406 1726854953.34000: when evaluation is False, skipping this task 15406 1726854953.34076: _execute() done 15406 1726854953.34079: dumping result to json 15406 1726854953.34083: done dumping result, returning 15406 1726854953.34086: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-3c83-32d3-0000000002f6] 15406 1726854953.34091: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000002f6 15406 1726854953.34165: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000002f6 15406 1726854953.34168: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15406 1726854953.34229: no more pending results, returning what we have 15406 1726854953.34232: results queue empty 15406 1726854953.34234: checking for any_errors_fatal 15406 1726854953.34236: done checking for any_errors_fatal 15406 1726854953.34236: checking for max_fail_percentage 15406 1726854953.34238: done checking for max_fail_percentage 15406 1726854953.34239: checking to see if all hosts have failed and the running result is not ok 15406 1726854953.34240: done checking to see if all hosts have failed 15406 1726854953.34241: getting the remaining hosts for this loop 15406 1726854953.34242: done getting the remaining hosts for this loop 15406 1726854953.34246: getting the next task for host managed_node2 15406 1726854953.34255: done getting next task for host managed_node2 15406 1726854953.34259: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15406 1726854953.34262: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854953.34276: getting variables 15406 1726854953.34284: in VariableManager get_vars() 15406 1726854953.34330: Calling all_inventory to load vars for managed_node2 15406 1726854953.34333: Calling groups_inventory to load vars for managed_node2 15406 1726854953.34336: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854953.34347: Calling all_plugins_play to load vars for managed_node2 15406 1726854953.34350: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854953.34354: Calling groups_plugins_play to load vars for managed_node2 15406 1726854953.36149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854953.39655: done with get_vars() 15406 1726854953.39683: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:55:53 -0400 (0:00:00.152) 0:00:21.220 ****** 15406 1726854953.39824: entering _queue_task() for managed_node2/stat 15406 1726854953.40195: worker is 1 (out of 1 available) 15406 1726854953.40209: exiting _queue_task() for managed_node2/stat 15406 1726854953.40223: done queuing things up, now waiting for results queue to drain 15406 1726854953.40224: waiting for pending results... 
15406 1726854953.40673: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 15406 1726854953.40677: in run() - task 0affcc66-ac2b-3c83-32d3-0000000002f8 15406 1726854953.40680: variable 'ansible_search_path' from source: unknown 15406 1726854953.40684: variable 'ansible_search_path' from source: unknown 15406 1726854953.40689: calling self._execute() 15406 1726854953.40839: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854953.40844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854953.40847: variable 'omit' from source: magic vars 15406 1726854953.41194: variable 'ansible_distribution_major_version' from source: facts 15406 1726854953.41198: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854953.41382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854953.41626: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854953.41669: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854953.41793: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854953.41797: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854953.41859: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854953.41891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854953.41920: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854953.41951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854953.42037: variable '__network_is_ostree' from source: set_fact 15406 1726854953.42049: Evaluated conditional (not __network_is_ostree is defined): False 15406 1726854953.42058: when evaluation is False, skipping this task 15406 1726854953.42068: _execute() done 15406 1726854953.42080: dumping result to json 15406 1726854953.42092: done dumping result, returning 15406 1726854953.42192: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-3c83-32d3-0000000002f8] 15406 1726854953.42198: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000002f8 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15406 1726854953.42325: no more pending results, returning what we have 15406 1726854953.42330: results queue empty 15406 1726854953.42331: checking for any_errors_fatal 15406 1726854953.42338: done checking for any_errors_fatal 15406 1726854953.42339: checking for max_fail_percentage 15406 1726854953.42341: done checking for max_fail_percentage 15406 1726854953.42342: checking to see if all hosts have failed and the running result is not ok 15406 1726854953.42342: done checking to see if all hosts have failed 15406 1726854953.42343: getting the remaining hosts for this loop 15406 1726854953.42345: done getting the remaining hosts for this loop 15406 1726854953.42349: getting the next task for host managed_node2 15406 1726854953.42357: done getting next task for host managed_node2 15406 
1726854953.42361: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15406 1726854953.42364: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854953.42382: getting variables 15406 1726854953.42384: in VariableManager get_vars() 15406 1726854953.42432: Calling all_inventory to load vars for managed_node2 15406 1726854953.42434: Calling groups_inventory to load vars for managed_node2 15406 1726854953.42437: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854953.42448: Calling all_plugins_play to load vars for managed_node2 15406 1726854953.42451: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854953.42454: Calling groups_plugins_play to load vars for managed_node2 15406 1726854953.43002: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000002f8 15406 1726854953.43005: WORKER PROCESS EXITING 15406 1726854953.44282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854953.45878: done with get_vars() 15406 1726854953.45911: done getting variables 15406 1726854953.45978: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:55:53 -0400 (0:00:00.061) 0:00:21.282 ****** 15406 1726854953.46017: entering _queue_task() for managed_node2/set_fact 15406 1726854953.46386: worker is 1 (out of 1 available) 15406 1726854953.46604: exiting _queue_task() for managed_node2/set_fact 15406 1726854953.46617: done queuing things up, now waiting for results queue to drain 15406 1726854953.46618: waiting for pending results... 15406 1726854953.46724: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15406 1726854953.46886: in run() - task 0affcc66-ac2b-3c83-32d3-0000000002f9 15406 1726854953.46912: variable 'ansible_search_path' from source: unknown 15406 1726854953.46920: variable 'ansible_search_path' from source: unknown 15406 1726854953.46965: calling self._execute() 15406 1726854953.47071: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854953.47084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854953.47109: variable 'omit' from source: magic vars 15406 1726854953.47526: variable 'ansible_distribution_major_version' from source: facts 15406 1726854953.47549: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854953.47734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854953.48007: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854953.48061: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854953.48103: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 
1726854953.48141: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854953.48273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854953.48310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854953.48342: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854953.48378: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854953.48475: variable '__network_is_ostree' from source: set_fact 15406 1726854953.48584: Evaluated conditional (not __network_is_ostree is defined): False 15406 1726854953.48589: when evaluation is False, skipping this task 15406 1726854953.48592: _execute() done 15406 1726854953.48597: dumping result to json 15406 1726854953.48600: done dumping result, returning 15406 1726854953.48602: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-3c83-32d3-0000000002f9] 15406 1726854953.48605: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000002f9 15406 1726854953.48674: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000002f9 15406 1726854953.48677: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15406 1726854953.48743: no more pending results, returning what we 
have 15406 1726854953.48747: results queue empty 15406 1726854953.48749: checking for any_errors_fatal 15406 1726854953.48760: done checking for any_errors_fatal 15406 1726854953.48761: checking for max_fail_percentage 15406 1726854953.48763: done checking for max_fail_percentage 15406 1726854953.48763: checking to see if all hosts have failed and the running result is not ok 15406 1726854953.48765: done checking to see if all hosts have failed 15406 1726854953.48766: getting the remaining hosts for this loop 15406 1726854953.48767: done getting the remaining hosts for this loop 15406 1726854953.48771: getting the next task for host managed_node2 15406 1726854953.48780: done getting next task for host managed_node2 15406 1726854953.48785: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15406 1726854953.48790: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854953.48811: getting variables 15406 1726854953.48813: in VariableManager get_vars() 15406 1726854953.48853: Calling all_inventory to load vars for managed_node2 15406 1726854953.48856: Calling groups_inventory to load vars for managed_node2 15406 1726854953.48859: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854953.48870: Calling all_plugins_play to load vars for managed_node2 15406 1726854953.48873: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854953.48876: Calling groups_plugins_play to load vars for managed_node2 15406 1726854953.50505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854953.52168: done with get_vars() 15406 1726854953.52192: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:55:53 -0400 (0:00:00.062) 0:00:21.345 ****** 15406 1726854953.52292: entering _queue_task() for managed_node2/service_facts 15406 1726854953.52625: worker is 1 (out of 1 available) 15406 1726854953.52639: exiting _queue_task() for managed_node2/service_facts 15406 1726854953.52652: done queuing things up, now waiting for results queue to drain 15406 1726854953.52653: waiting for pending results... 
15406 1726854953.53104: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 15406 1726854953.53109: in run() - task 0affcc66-ac2b-3c83-32d3-0000000002fb 15406 1726854953.53112: variable 'ansible_search_path' from source: unknown 15406 1726854953.53115: variable 'ansible_search_path' from source: unknown 15406 1726854953.53117: calling self._execute() 15406 1726854953.53213: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854953.53227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854953.53244: variable 'omit' from source: magic vars 15406 1726854953.53623: variable 'ansible_distribution_major_version' from source: facts 15406 1726854953.53639: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854953.53650: variable 'omit' from source: magic vars 15406 1726854953.53723: variable 'omit' from source: magic vars 15406 1726854953.53760: variable 'omit' from source: magic vars 15406 1726854953.53813: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854953.53851: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854953.53874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854953.53908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854953.53930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854953.53963: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854953.53996: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854953.53999: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 15406 1726854953.54082: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854953.54106: Set connection var ansible_timeout to 10 15406 1726854953.54193: Set connection var ansible_connection to ssh 15406 1726854953.54198: Set connection var ansible_shell_type to sh 15406 1726854953.54201: Set connection var ansible_shell_executable to /bin/sh 15406 1726854953.54203: Set connection var ansible_pipelining to False 15406 1726854953.54207: variable 'ansible_shell_executable' from source: unknown 15406 1726854953.54212: variable 'ansible_connection' from source: unknown 15406 1726854953.54214: variable 'ansible_module_compression' from source: unknown 15406 1726854953.54216: variable 'ansible_shell_type' from source: unknown 15406 1726854953.54218: variable 'ansible_shell_executable' from source: unknown 15406 1726854953.54219: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854953.54221: variable 'ansible_pipelining' from source: unknown 15406 1726854953.54223: variable 'ansible_timeout' from source: unknown 15406 1726854953.54225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854953.54414: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854953.54436: variable 'omit' from source: magic vars 15406 1726854953.54447: starting attempt loop 15406 1726854953.54455: running the handler 15406 1726854953.54477: _low_level_execute_command(): starting 15406 1726854953.54491: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854953.55293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854953.55317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854953.55413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854953.55441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854953.55613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854953.57752: stdout chunk (state=3): >>>/root <<< 15406 1726854953.57757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854953.57760: stdout chunk (state=3): >>><<< 15406 1726854953.57766: stderr chunk (state=3): >>><<< 15406 1726854953.57770: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854953.57772: _low_level_execute_command(): starting 15406 1726854953.57775: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854953.5765543-16423-83135878161448 `" && echo ansible-tmp-1726854953.5765543-16423-83135878161448="` echo /root/.ansible/tmp/ansible-tmp-1726854953.5765543-16423-83135878161448 `" ) && sleep 0' 15406 1726854953.58668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854953.58697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854953.58711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15406 1726854953.58723: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854953.58775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854953.58807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854953.58881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854953.60799: stdout chunk (state=3): >>>ansible-tmp-1726854953.5765543-16423-83135878161448=/root/.ansible/tmp/ansible-tmp-1726854953.5765543-16423-83135878161448 <<< 15406 1726854953.60947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854953.60962: stdout chunk (state=3): >>><<< 15406 1726854953.60975: stderr chunk (state=3): >>><<< 15406 1726854953.61000: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854953.5765543-16423-83135878161448=/root/.ansible/tmp/ansible-tmp-1726854953.5765543-16423-83135878161448 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854953.61052: variable 'ansible_module_compression' from source: unknown 15406 1726854953.61198: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15406 1726854953.61201: variable 'ansible_facts' from source: unknown 15406 1726854953.61246: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854953.5765543-16423-83135878161448/AnsiballZ_service_facts.py 15406 1726854953.61434: Sending initial data 15406 1726854953.61444: Sent initial data (161 bytes) 15406 1726854953.62025: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854953.62040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854953.62056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854953.62109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854953.62124: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854953.62137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854953.62147: stderr chunk (state=3): >>>debug2: match found <<< 15406 1726854953.62182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854953.62240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854953.62284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854953.62470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854953.63997: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854953.64070: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854953.64151: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpw75lqwpo /root/.ansible/tmp/ansible-tmp-1726854953.5765543-16423-83135878161448/AnsiballZ_service_facts.py <<< 15406 1726854953.64155: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854953.5765543-16423-83135878161448/AnsiballZ_service_facts.py" <<< 15406 1726854953.64220: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpw75lqwpo" to remote "/root/.ansible/tmp/ansible-tmp-1726854953.5765543-16423-83135878161448/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854953.5765543-16423-83135878161448/AnsiballZ_service_facts.py" <<< 15406 1726854953.65125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854953.65265: stderr chunk (state=3): >>><<< 15406 1726854953.65269: stdout chunk (state=3): >>><<< 15406 1726854953.65271: done transferring module to remote 15406 1726854953.65274: _low_level_execute_command(): starting 15406 1726854953.65276: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854953.5765543-16423-83135878161448/ /root/.ansible/tmp/ansible-tmp-1726854953.5765543-16423-83135878161448/AnsiballZ_service_facts.py && sleep 0' 15406 1726854953.65840: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854953.65906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854953.65967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854953.65984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854953.66011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854953.66129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854953.67938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854953.67966: stderr chunk (state=3): >>><<< 15406 1726854953.67969: stdout chunk (state=3): >>><<< 15406 1726854953.68063: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 
10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854953.68068: _low_level_execute_command(): starting 15406 1726854953.68070: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854953.5765543-16423-83135878161448/AnsiballZ_service_facts.py && sleep 0' 15406 1726854953.68924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854953.69044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854953.69091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854953.69206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854953.69310: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 15406 1726854955.19931: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 15406 1726854955.20102: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 15406 1726854955.20116: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15406 1726854955.21582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854955.21601: stdout chunk (state=3): >>><<< 15406 1726854955.21624: stderr chunk (state=3): >>><<< 15406 1726854955.21686: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": 
{"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": 
{"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, 
"sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
15406 1726854955.22773: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854953.5765543-16423-83135878161448/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854955.22794: _low_level_execute_command(): starting 15406 1726854955.22801: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854953.5765543-16423-83135878161448/ > /dev/null 2>&1 && sleep 0' 15406 1726854955.23686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854955.23692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854955.23741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 15406 
1726854955.23819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854955.23926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854955.23972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854955.24193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854955.25964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854955.25974: stdout chunk (state=3): >>><<< 15406 1726854955.25985: stderr chunk (state=3): >>><<< 15406 1726854955.26011: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854955.26023: 
handler run complete 15406 1726854955.26227: variable 'ansible_facts' from source: unknown 15406 1726854955.26391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854955.26993: variable 'ansible_facts' from source: unknown 15406 1726854955.27043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854955.27251: attempt loop complete, returning result 15406 1726854955.27262: _execute() done 15406 1726854955.27269: dumping result to json 15406 1726854955.27342: done dumping result, returning 15406 1726854955.27356: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-3c83-32d3-0000000002fb] 15406 1726854955.27365: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000002fb ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15406 1726854955.29781: no more pending results, returning what we have 15406 1726854955.29784: results queue empty 15406 1726854955.29785: checking for any_errors_fatal 15406 1726854955.29790: done checking for any_errors_fatal 15406 1726854955.29791: checking for max_fail_percentage 15406 1726854955.29792: done checking for max_fail_percentage 15406 1726854955.29793: checking to see if all hosts have failed and the running result is not ok 15406 1726854955.29794: done checking to see if all hosts have failed 15406 1726854955.29797: getting the remaining hosts for this loop 15406 1726854955.29798: done getting the remaining hosts for this loop 15406 1726854955.29801: getting the next task for host managed_node2 15406 1726854955.29806: done getting next task for host managed_node2 15406 1726854955.29810: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15406 1726854955.29812: ^ state is: HOST 
STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854955.29822: getting variables 15406 1726854955.29823: in VariableManager get_vars() 15406 1726854955.29853: Calling all_inventory to load vars for managed_node2 15406 1726854955.29856: Calling groups_inventory to load vars for managed_node2 15406 1726854955.29858: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854955.29867: Calling all_plugins_play to load vars for managed_node2 15406 1726854955.29870: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854955.29873: Calling groups_plugins_play to load vars for managed_node2 15406 1726854955.30782: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000002fb 15406 1726854955.30786: WORKER PROCESS EXITING 15406 1726854955.32252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854955.35661: done with get_vars() 15406 1726854955.35898: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:55:55 -0400 (0:00:01.837) 0:00:23.182 ****** 15406 1726854955.36001: entering _queue_task() for managed_node2/package_facts 15406 1726854955.36493: worker is 1 (out of 1 available) 15406 1726854955.36508: exiting _queue_task() for managed_node2/package_facts 
15406 1726854955.36523: done queuing things up, now waiting for results queue to drain 15406 1726854955.36524: waiting for pending results... 15406 1726854955.37115: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 15406 1726854955.37440: in run() - task 0affcc66-ac2b-3c83-32d3-0000000002fc 15406 1726854955.37521: variable 'ansible_search_path' from source: unknown 15406 1726854955.37530: variable 'ansible_search_path' from source: unknown 15406 1726854955.37616: calling self._execute() 15406 1726854955.37833: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854955.37847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854955.37860: variable 'omit' from source: magic vars 15406 1726854955.38679: variable 'ansible_distribution_major_version' from source: facts 15406 1726854955.38714: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854955.38893: variable 'omit' from source: magic vars 15406 1726854955.38896: variable 'omit' from source: magic vars 15406 1726854955.38929: variable 'omit' from source: magic vars 15406 1726854955.38975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854955.39082: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854955.39129: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854955.39174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854955.39232: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854955.39375: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854955.39379: 
variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854955.39381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854955.39514: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854955.39554: Set connection var ansible_timeout to 10 15406 1726854955.39600: Set connection var ansible_connection to ssh 15406 1726854955.39612: Set connection var ansible_shell_type to sh 15406 1726854955.39623: Set connection var ansible_shell_executable to /bin/sh 15406 1726854955.39636: Set connection var ansible_pipelining to False 15406 1726854955.39762: variable 'ansible_shell_executable' from source: unknown 15406 1726854955.39765: variable 'ansible_connection' from source: unknown 15406 1726854955.39768: variable 'ansible_module_compression' from source: unknown 15406 1726854955.39770: variable 'ansible_shell_type' from source: unknown 15406 1726854955.39772: variable 'ansible_shell_executable' from source: unknown 15406 1726854955.39774: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854955.39776: variable 'ansible_pipelining' from source: unknown 15406 1726854955.39778: variable 'ansible_timeout' from source: unknown 15406 1726854955.39790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854955.40290: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854955.40297: variable 'omit' from source: magic vars 15406 1726854955.40300: starting attempt loop 15406 1726854955.40308: running the handler 15406 1726854955.40328: _low_level_execute_command(): starting 15406 1726854955.40340: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854955.41841: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854955.41870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854955.42143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854955.42289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854955.43990: stdout chunk (state=3): >>>/root <<< 15406 1726854955.44235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854955.44282: stderr chunk (state=3): >>><<< 15406 1726854955.44289: stdout chunk (state=3): >>><<< 15406 1726854955.44332: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854955.44336: _low_level_execute_command(): starting 15406 1726854955.44348: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854955.4431226-16479-235190287046406 `" && echo ansible-tmp-1726854955.4431226-16479-235190287046406="` echo /root/.ansible/tmp/ansible-tmp-1726854955.4431226-16479-235190287046406 `" ) && sleep 0' 15406 1726854955.45540: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854955.45553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 15406 1726854955.45578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854955.45732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854955.45856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854955.45963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854955.47883: stdout chunk (state=3): >>>ansible-tmp-1726854955.4431226-16479-235190287046406=/root/.ansible/tmp/ansible-tmp-1726854955.4431226-16479-235190287046406 <<< 15406 1726854955.48293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854955.48297: stderr chunk (state=3): >>><<< 15406 1726854955.48300: stdout chunk (state=3): >>><<< 15406 1726854955.48303: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854955.4431226-16479-235190287046406=/root/.ansible/tmp/ansible-tmp-1726854955.4431226-16479-235190287046406 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854955.48306: variable 'ansible_module_compression' from source: unknown 15406 1726854955.48309: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15406 1726854955.48602: variable 'ansible_facts' from source: unknown 15406 1726854955.48964: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854955.4431226-16479-235190287046406/AnsiballZ_package_facts.py 15406 1726854955.49392: Sending initial data 15406 1726854955.49448: Sent initial data (162 bytes) 15406 1726854955.51354: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854955.51412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854955.51446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854955.51622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854955.53209: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854955.53482: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854955.53518: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpoz6hyhgp /root/.ansible/tmp/ansible-tmp-1726854955.4431226-16479-235190287046406/AnsiballZ_package_facts.py <<< 15406 1726854955.53522: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854955.4431226-16479-235190287046406/AnsiballZ_package_facts.py" <<< 15406 1726854955.53592: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpoz6hyhgp" to remote "/root/.ansible/tmp/ansible-tmp-1726854955.4431226-16479-235190287046406/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854955.4431226-16479-235190287046406/AnsiballZ_package_facts.py" <<< 15406 1726854955.56497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854955.56697: stderr chunk (state=3): >>><<< 15406 1726854955.56713: stdout chunk (state=3): >>><<< 15406 1726854955.56815: done transferring module to remote 15406 1726854955.56818: _low_level_execute_command(): starting 15406 1726854955.56821: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854955.4431226-16479-235190287046406/ /root/.ansible/tmp/ansible-tmp-1726854955.4431226-16479-235190287046406/AnsiballZ_package_facts.py && sleep 0' 15406 1726854955.58138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854955.58142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854955.58144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854955.58147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854955.58223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854955.58226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854955.58320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854955.60240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854955.60243: stdout chunk (state=3): >>><<< 15406 1726854955.60246: stderr chunk (state=3): >>><<< 15406 1726854955.60699: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854955.60703: _low_level_execute_command(): starting 15406 1726854955.60705: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854955.4431226-16479-235190287046406/AnsiballZ_package_facts.py && sleep 0' 15406 1726854955.61870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854955.61873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854955.61898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854955.61901: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 15406 1726854955.61919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854955.61923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 
1726854955.61991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854955.61995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854955.62018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854955.62130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854956.06393: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": 
[{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 15406 1726854956.06412: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 15406 1726854956.06416: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": 
[{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 15406 1726854956.06423: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": 
[{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 15406 1726854956.06427: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": 
"256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 15406 1726854956.06430: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": 
"4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 15406 1726854956.06436: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": 
"9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", 
"release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 15406 1726854956.06474: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": 
"2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 15406 1726854956.06480: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15406 1726854956.08241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
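The module result streamed above follows the standard `package_facts` shape: `ansible_facts.packages` maps each package name to a list of install records, each with `name`, `version`, `release`, `epoch`, `arch`, and `source`. A minimal sketch of consuming that JSON outside Ansible (the two sample packages are copied from the log; the `evr` helper is illustrative, not part of the playbook run):

```python
import json

# Tiny excerpt of the ansible_facts.packages payload seen in the log above.
raw = '''{"ansible_facts": {"packages": {
  "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10",
               "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10",
                   "epoch": 1, "arch": "x86_64", "source": "rpm"}]}}}'''

def evr(rec):
    """Render an rpm-style [epoch:]version-release string for one install record."""
    prefix = f'{rec["epoch"]}:' if rec["epoch"] is not None else ""
    return f'{prefix}{rec["version"]}-{rec["release"]}'

packages = json.loads(raw)["ansible_facts"]["packages"]
for name, records in sorted(packages.items()):
    for rec in records:
        print(f"{name} {evr(rec)} ({rec['arch']})")
```

Note that each name maps to a *list* because multiple versions of the same package (e.g. kernels) can be installed side by side; a `null` epoch is simply omitted from the EVR string, matching how rpm displays it.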
<<< 15406 1726854956.08432: stderr chunk (state=3): >>><<< 15406 1726854956.08436: stdout chunk (state=3): >>><<< 15406 1726854956.08702: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
15406 1726854956.11363: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854955.4431226-16479-235190287046406/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854956.11393: _low_level_execute_command(): starting 15406 1726854956.11404: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854955.4431226-16479-235190287046406/ > /dev/null 2>&1 && sleep 0' 15406 1726854956.12706: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854956.12742: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854956.12759: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854956.12808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854956.13116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854956.15015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854956.15025: stdout chunk (state=3): >>><<< 15406 1726854956.15036: stderr chunk (state=3): >>><<< 15406 1726854956.15052: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854956.15067: handler run complete 15406 1726854956.16893: variable 'ansible_facts' from source: unknown 15406 1726854956.17690: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854956.21644: variable 'ansible_facts' from source: unknown 15406 1726854956.22435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854956.23960: attempt loop complete, returning result 15406 1726854956.23992: _execute() done 15406 1726854956.24008: dumping result to json 15406 1726854956.24266: done dumping result, returning 15406 1726854956.24507: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-3c83-32d3-0000000002fc] 15406 1726854956.24517: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000002fc 15406 1726854956.29135: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000002fc 15406 1726854956.29139: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15406 1726854956.29243: no more pending results, returning what we have 15406 1726854956.29246: results queue empty 15406 1726854956.29247: checking for any_errors_fatal 15406 1726854956.29252: done checking for any_errors_fatal 15406 1726854956.29252: checking for max_fail_percentage 15406 1726854956.29254: done checking for max_fail_percentage 15406 1726854956.29255: checking to see if all hosts have failed and the running result is not ok 15406 1726854956.29256: done checking to see if all hosts have failed 15406 1726854956.29256: getting the remaining hosts for this loop 15406 1726854956.29260: done getting the remaining hosts for this loop 15406 1726854956.29265: getting the next task for host managed_node2 15406 1726854956.29272: done getting next task for host managed_node2 15406 1726854956.29276: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15406 1726854956.29278: 
^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854956.29289: getting variables 15406 1726854956.29291: in VariableManager get_vars() 15406 1726854956.29322: Calling all_inventory to load vars for managed_node2 15406 1726854956.29325: Calling groups_inventory to load vars for managed_node2 15406 1726854956.29328: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854956.29336: Calling all_plugins_play to load vars for managed_node2 15406 1726854956.29339: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854956.29341: Calling groups_plugins_play to load vars for managed_node2 15406 1726854956.30481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854956.33624: done with get_vars() 15406 1726854956.33656: done getting variables 15406 1726854956.33719: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:55:56 -0400 (0:00:00.977) 0:00:24.160 ****** 15406 1726854956.33751: entering _queue_task() for managed_node2/debug 15406 1726854956.34775: worker is 1 (out of 1 available) 15406 1726854956.34991: exiting _queue_task() for managed_node2/debug 15406 1726854956.35002: done queuing things up, now waiting for results queue to drain 15406 
1726854956.35003: waiting for pending results... 15406 1726854956.35715: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 15406 1726854956.36059: in run() - task 0affcc66-ac2b-3c83-32d3-00000000003b 15406 1726854956.36063: variable 'ansible_search_path' from source: unknown 15406 1726854956.36066: variable 'ansible_search_path' from source: unknown 15406 1726854956.36069: calling self._execute() 15406 1726854956.36194: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854956.36500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854956.36505: variable 'omit' from source: magic vars 15406 1726854956.37128: variable 'ansible_distribution_major_version' from source: facts 15406 1726854956.37266: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854956.37278: variable 'omit' from source: magic vars 15406 1726854956.37330: variable 'omit' from source: magic vars 15406 1726854956.37533: variable 'network_provider' from source: set_fact 15406 1726854956.37644: variable 'omit' from source: magic vars 15406 1726854956.37806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854956.37939: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854956.37944: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854956.38409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854956.38413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854956.38416: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854956.38419: variable 'ansible_host' from 
source: host vars for 'managed_node2' 15406 1726854956.38422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854956.38793: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854956.38808: Set connection var ansible_timeout to 10 15406 1726854956.38812: Set connection var ansible_connection to ssh 15406 1726854956.38815: Set connection var ansible_shell_type to sh 15406 1726854956.38817: Set connection var ansible_shell_executable to /bin/sh 15406 1726854956.38819: Set connection var ansible_pipelining to False 15406 1726854956.38821: variable 'ansible_shell_executable' from source: unknown 15406 1726854956.38824: variable 'ansible_connection' from source: unknown 15406 1726854956.38826: variable 'ansible_module_compression' from source: unknown 15406 1726854956.38828: variable 'ansible_shell_type' from source: unknown 15406 1726854956.38831: variable 'ansible_shell_executable' from source: unknown 15406 1726854956.38833: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854956.38835: variable 'ansible_pipelining' from source: unknown 15406 1726854956.38838: variable 'ansible_timeout' from source: unknown 15406 1726854956.38840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854956.39503: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854956.39507: variable 'omit' from source: magic vars 15406 1726854956.39510: starting attempt loop 15406 1726854956.39512: running the handler 15406 1726854956.39514: handler run complete 15406 1726854956.39516: attempt loop complete, returning result 15406 1726854956.39518: _execute() done 15406 1726854956.39520: dumping result to json 15406 
1726854956.39522: done dumping result, returning 15406 1726854956.39524: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-3c83-32d3-00000000003b] 15406 1726854956.39526: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000003b 15406 1726854956.39593: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000003b 15406 1726854956.39697: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 15406 1726854956.39851: no more pending results, returning what we have 15406 1726854956.39855: results queue empty 15406 1726854956.39856: checking for any_errors_fatal 15406 1726854956.39870: done checking for any_errors_fatal 15406 1726854956.39871: checking for max_fail_percentage 15406 1726854956.39873: done checking for max_fail_percentage 15406 1726854956.39874: checking to see if all hosts have failed and the running result is not ok 15406 1726854956.39875: done checking to see if all hosts have failed 15406 1726854956.39876: getting the remaining hosts for this loop 15406 1726854956.39877: done getting the remaining hosts for this loop 15406 1726854956.39881: getting the next task for host managed_node2 15406 1726854956.39891: done getting next task for host managed_node2 15406 1726854956.39896: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15406 1726854956.39898: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854956.39909: getting variables 15406 1726854956.39911: in VariableManager get_vars() 15406 1726854956.39947: Calling all_inventory to load vars for managed_node2 15406 1726854956.39950: Calling groups_inventory to load vars for managed_node2 15406 1726854956.39953: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854956.39964: Calling all_plugins_play to load vars for managed_node2 15406 1726854956.39967: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854956.39971: Calling groups_plugins_play to load vars for managed_node2 15406 1726854956.43144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854956.45821: done with get_vars() 15406 1726854956.45855: done getting variables 15406 1726854956.45924: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:55:56 -0400 (0:00:00.122) 0:00:24.282 ****** 15406 1726854956.45959: entering _queue_task() for managed_node2/fail 15406 1726854956.46327: worker is 1 (out of 1 available) 15406 1726854956.46341: exiting _queue_task() for managed_node2/fail 15406 1726854956.46354: done queuing things up, now waiting for results queue to drain 15406 1726854956.46356: waiting for pending results... 
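The "Print network provider" task that just completed is a plain `debug` action: the log shows the `debug` action plugin being loaded, the `network_provider` variable resolved from `set_fact`, and the result `MSG: Using network provider: nm`. Its source at `roles/network/tasks/main.yml:7` presumably looks roughly like this (a hedged reconstruction from the logged values, not the role's verbatim source):

```yaml
# Hypothetical reconstruction of the task traced above; the real task
# lives at roles/network/tasks/main.yml:7 in fedora.linux_system_roles.network.
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```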
15406 1726854956.46650: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15406 1726854956.46774: in run() - task 0affcc66-ac2b-3c83-32d3-00000000003c 15406 1726854956.46802: variable 'ansible_search_path' from source: unknown 15406 1726854956.46815: variable 'ansible_search_path' from source: unknown 15406 1726854956.46858: calling self._execute() 15406 1726854956.46967: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854956.46981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854956.46999: variable 'omit' from source: magic vars 15406 1726854956.47522: variable 'ansible_distribution_major_version' from source: facts 15406 1726854956.47526: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854956.47646: variable 'network_state' from source: role '' defaults 15406 1726854956.47664: Evaluated conditional (network_state != {}): False 15406 1726854956.47671: when evaluation is False, skipping this task 15406 1726854956.47682: _execute() done 15406 1726854956.47693: dumping result to json 15406 1726854956.47704: done dumping result, returning 15406 1726854956.47717: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-3c83-32d3-00000000003c] 15406 1726854956.47726: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000003c skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15406 1726854956.47891: no more pending results, returning what we have 15406 1726854956.47895: results queue empty 15406 1726854956.47897: checking for any_errors_fatal 15406 1726854956.47906: done 
checking for any_errors_fatal 15406 1726854956.47907: checking for max_fail_percentage 15406 1726854956.47908: done checking for max_fail_percentage 15406 1726854956.47909: checking to see if all hosts have failed and the running result is not ok 15406 1726854956.47910: done checking to see if all hosts have failed 15406 1726854956.47910: getting the remaining hosts for this loop 15406 1726854956.47912: done getting the remaining hosts for this loop 15406 1726854956.47916: getting the next task for host managed_node2 15406 1726854956.47922: done getting next task for host managed_node2 15406 1726854956.47926: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15406 1726854956.47929: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854956.47944: getting variables 15406 1726854956.47946: in VariableManager get_vars() 15406 1726854956.47985: Calling all_inventory to load vars for managed_node2 15406 1726854956.47990: Calling groups_inventory to load vars for managed_node2 15406 1726854956.47993: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854956.48006: Calling all_plugins_play to load vars for managed_node2 15406 1726854956.48008: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854956.48011: Calling groups_plugins_play to load vars for managed_node2 15406 1726854956.49096: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000003c 15406 1726854956.49099: WORKER PROCESS EXITING 15406 1726854956.51185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854956.54303: done with get_vars() 15406 1726854956.54336: done getting variables 15406 1726854956.54597: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:55:56 -0400 (0:00:00.086) 0:00:24.369 ****** 15406 1726854956.54627: entering _queue_task() for managed_node2/fail 15406 1726854956.55424: worker is 1 (out of 1 available) 15406 1726854956.55438: exiting _queue_task() for managed_node2/fail 15406 1726854956.55450: done queuing things up, now waiting for results queue to drain 15406 1726854956.55451: waiting for pending results... 
15406 1726854956.55903: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15406 1726854956.56126: in run() - task 0affcc66-ac2b-3c83-32d3-00000000003d 15406 1726854956.56134: variable 'ansible_search_path' from source: unknown 15406 1726854956.56138: variable 'ansible_search_path' from source: unknown 15406 1726854956.56176: calling self._execute() 15406 1726854956.56386: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854956.56504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854956.56517: variable 'omit' from source: magic vars 15406 1726854956.57330: variable 'ansible_distribution_major_version' from source: facts 15406 1726854956.57339: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854956.57715: variable 'network_state' from source: role '' defaults 15406 1726854956.57728: Evaluated conditional (network_state != {}): False 15406 1726854956.57737: when evaluation is False, skipping this task 15406 1726854956.57740: _execute() done 15406 1726854956.57743: dumping result to json 15406 1726854956.57745: done dumping result, returning 15406 1726854956.57871: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-3c83-32d3-00000000003d] 15406 1726854956.57875: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000003d 15406 1726854956.57944: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000003d 15406 1726854956.57947: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15406 1726854956.57999: no more pending results, returning what we have 15406 
1726854956.58002: results queue empty 15406 1726854956.58004: checking for any_errors_fatal 15406 1726854956.58010: done checking for any_errors_fatal 15406 1726854956.58011: checking for max_fail_percentage 15406 1726854956.58012: done checking for max_fail_percentage 15406 1726854956.58013: checking to see if all hosts have failed and the running result is not ok 15406 1726854956.58014: done checking to see if all hosts have failed 15406 1726854956.58015: getting the remaining hosts for this loop 15406 1726854956.58016: done getting the remaining hosts for this loop 15406 1726854956.58020: getting the next task for host managed_node2 15406 1726854956.58025: done getting next task for host managed_node2 15406 1726854956.58029: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15406 1726854956.58031: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854956.58044: getting variables 15406 1726854956.58046: in VariableManager get_vars() 15406 1726854956.58081: Calling all_inventory to load vars for managed_node2 15406 1726854956.58084: Calling groups_inventory to load vars for managed_node2 15406 1726854956.58086: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854956.58099: Calling all_plugins_play to load vars for managed_node2 15406 1726854956.58102: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854956.58104: Calling groups_plugins_play to load vars for managed_node2 15406 1726854956.61558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854956.64774: done with get_vars() 15406 1726854956.65010: done getting variables 15406 1726854956.65064: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:55:56 -0400 (0:00:00.104) 0:00:24.473 ****** 15406 1726854956.65094: entering _queue_task() for managed_node2/fail 15406 1726854956.66023: worker is 1 (out of 1 available) 15406 1726854956.66034: exiting _queue_task() for managed_node2/fail 15406 1726854956.66046: done queuing things up, now waiting for results queue to drain 15406 1726854956.66047: waiting for pending results... 
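Both "Abort applying the network state configuration ..." tasks above were skipped for the same reason: the logged `false_condition` is `network_state != {}`, and `network_state` comes from the role's defaults, where it is an empty dict. A hedged sketch of that guard pattern (only the condition is taken from the log; the task body and message wording are invented for illustration):

```yaml
# Hypothetical sketch of a guarded fail task like the ones skipped above.
# The when-clause "network_state != {}" is the logged false_condition;
# the fail message is assumed, not the role's actual text.
- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  fail:
    msg: Applying network_state is not supported in this configuration  # assumed wording
  when: network_state != {}
```

Because the conditional evaluates to `False` on this host, Ansible reports `skipping: [managed_node2]` instead of running the `fail` action, which is exactly what the log records.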
15406 1726854956.66507: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15406 1726854956.66514: in run() - task 0affcc66-ac2b-3c83-32d3-00000000003e 15406 1726854956.66517: variable 'ansible_search_path' from source: unknown 15406 1726854956.66520: variable 'ansible_search_path' from source: unknown 15406 1726854956.66635: calling self._execute() 15406 1726854956.66901: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854956.66914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854956.66933: variable 'omit' from source: magic vars 15406 1726854956.67754: variable 'ansible_distribution_major_version' from source: facts 15406 1726854956.67773: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854956.68061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854956.72777: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854956.72953: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854956.72996: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854956.73295: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854956.73298: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854956.73340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854956.73376: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854956.73433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854956.73553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854956.73572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854956.73946: variable 'ansible_distribution_major_version' from source: facts 15406 1726854956.73949: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15406 1726854956.74109: variable 'ansible_distribution' from source: facts 15406 1726854956.74119: variable '__network_rh_distros' from source: role '' defaults 15406 1726854956.74134: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15406 1726854956.74630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854956.74732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854956.74803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 
1726854956.74911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854956.74937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854956.74992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854956.75152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854956.75294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854956.75297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854956.75300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854956.75478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854956.75658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15406 1726854956.75686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854956.75810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854956.75813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854956.76680: variable 'network_connections' from source: play vars 15406 1726854956.76750: variable 'profile' from source: play vars 15406 1726854956.76819: variable 'profile' from source: play vars 15406 1726854956.76822: variable 'interface' from source: set_fact 15406 1726854956.77090: variable 'interface' from source: set_fact 15406 1726854956.77103: variable 'network_state' from source: role '' defaults 15406 1726854956.77170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854956.77569: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854956.77746: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854956.77774: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854956.77804: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854956.77931: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854956.78071: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854956.78385: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854956.78417: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854956.78493: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15406 1726854956.78497: when evaluation is False, skipping this task 15406 1726854956.78499: _execute() done 15406 1726854956.78502: dumping result to json 15406 1726854956.78509: done dumping result, returning 15406 1726854956.78512: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-3c83-32d3-00000000003e] 15406 1726854956.78514: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000003e 15406 1726854956.78571: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000003e 15406 1726854956.78573: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15406 
1726854956.78660: no more pending results, returning what we have 15406 1726854956.78663: results queue empty 15406 1726854956.78664: checking for any_errors_fatal 15406 1726854956.78672: done checking for any_errors_fatal 15406 1726854956.78673: checking for max_fail_percentage 15406 1726854956.78675: done checking for max_fail_percentage 15406 1726854956.78676: checking to see if all hosts have failed and the running result is not ok 15406 1726854956.78676: done checking to see if all hosts have failed 15406 1726854956.78677: getting the remaining hosts for this loop 15406 1726854956.78678: done getting the remaining hosts for this loop 15406 1726854956.78683: getting the next task for host managed_node2 15406 1726854956.78692: done getting next task for host managed_node2 15406 1726854956.78697: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15406 1726854956.78699: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854956.78714: getting variables 15406 1726854956.78716: in VariableManager get_vars() 15406 1726854956.78761: Calling all_inventory to load vars for managed_node2 15406 1726854956.78764: Calling groups_inventory to load vars for managed_node2 15406 1726854956.78766: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854956.78779: Calling all_plugins_play to load vars for managed_node2 15406 1726854956.78782: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854956.78785: Calling groups_plugins_play to load vars for managed_node2 15406 1726854956.81730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854956.85414: done with get_vars() 15406 1726854956.85604: done getting variables 15406 1726854956.85670: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:55:56 -0400 (0:00:00.206) 0:00:24.679 ****** 15406 1726854956.85704: entering _queue_task() for managed_node2/dnf 15406 1726854956.86454: worker is 1 (out of 1 available) 15406 1726854956.86466: exiting _queue_task() for managed_node2/dnf 15406 1726854956.86477: done queuing things up, now waiting for results queue to drain 15406 1726854956.86479: waiting for pending results... 
15406 1726854956.87105: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15406 1726854956.87272: in run() - task 0affcc66-ac2b-3c83-32d3-00000000003f 15406 1726854956.87289: variable 'ansible_search_path' from source: unknown 15406 1726854956.87293: variable 'ansible_search_path' from source: unknown 15406 1726854956.87328: calling self._execute() 15406 1726854956.87580: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854956.87655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854956.87665: variable 'omit' from source: magic vars 15406 1726854956.89098: variable 'ansible_distribution_major_version' from source: facts 15406 1726854956.89112: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854956.89834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854956.95150: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854956.95271: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854956.95327: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854956.95468: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854956.95599: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854956.95673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854956.95764: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854956.95858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854956.95982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854956.96055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854956.96386: variable 'ansible_distribution' from source: facts 15406 1726854956.96394: variable 'ansible_distribution_major_version' from source: facts 15406 1726854956.96402: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15406 1726854956.96649: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854956.97220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854956.97224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854956.97226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854956.97244: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854956.97280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854956.97401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854956.97550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854956.97554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854956.97556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854956.97571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854956.97703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854956.97731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 
1726854956.97764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854956.97894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854956.97900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854956.98039: variable 'network_connections' from source: play vars 15406 1726854956.98056: variable 'profile' from source: play vars 15406 1726854956.98126: variable 'profile' from source: play vars 15406 1726854956.98135: variable 'interface' from source: set_fact 15406 1726854956.98195: variable 'interface' from source: set_fact 15406 1726854956.98277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854956.98520: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854956.98567: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854956.98606: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854956.98670: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854956.98756: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854956.98789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854956.98873: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854956.98891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854956.98941: variable '__network_team_connections_defined' from source: role '' defaults 15406 1726854956.99358: variable 'network_connections' from source: play vars 15406 1726854956.99370: variable 'profile' from source: play vars 15406 1726854956.99462: variable 'profile' from source: play vars 15406 1726854956.99472: variable 'interface' from source: set_fact 15406 1726854956.99539: variable 'interface' from source: set_fact 15406 1726854956.99575: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15406 1726854956.99638: when evaluation is False, skipping this task 15406 1726854956.99647: _execute() done 15406 1726854956.99659: dumping result to json 15406 1726854956.99667: done dumping result, returning 15406 1726854956.99684: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-3c83-32d3-00000000003f] 15406 1726854956.99699: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000003f 15406 1726854956.99917: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000003f 15406 1726854956.99920: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 15406 1726854956.99970: no more pending results, returning what we have 15406 1726854956.99973: results queue empty 15406 1726854956.99974: checking for any_errors_fatal 15406 1726854956.99982: done checking for any_errors_fatal 15406 1726854956.99983: checking for max_fail_percentage 15406 1726854956.99985: done checking for max_fail_percentage 15406 1726854956.99986: checking to see if all hosts have failed and the running result is not ok 15406 1726854956.99988: done checking to see if all hosts have failed 15406 1726854956.99989: getting the remaining hosts for this loop 15406 1726854956.99990: done getting the remaining hosts for this loop 15406 1726854956.99994: getting the next task for host managed_node2 15406 1726854957.00002: done getting next task for host managed_node2 15406 1726854957.00006: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15406 1726854957.00008: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854957.00022: getting variables 15406 1726854957.00024: in VariableManager get_vars() 15406 1726854957.00066: Calling all_inventory to load vars for managed_node2 15406 1726854957.00069: Calling groups_inventory to load vars for managed_node2 15406 1726854957.00072: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854957.00083: Calling all_plugins_play to load vars for managed_node2 15406 1726854957.00086: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854957.00307: Calling groups_plugins_play to load vars for managed_node2 15406 1726854957.02883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854957.06322: done with get_vars() 15406 1726854957.06358: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15406 1726854957.06731: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:55:57 -0400 (0:00:00.210) 0:00:24.890 ****** 15406 1726854957.06764: entering _queue_task() for managed_node2/yum 15406 1726854957.07633: worker is 1 (out of 1 available) 15406 1726854957.07646: exiting _queue_task() for managed_node2/yum 15406 1726854957.07658: done queuing things up, now waiting for results queue to drain 15406 1726854957.07659: waiting for pending results... 
15406 1726854957.08320: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15406 1726854957.08402: in run() - task 0affcc66-ac2b-3c83-32d3-000000000040 15406 1726854957.08438: variable 'ansible_search_path' from source: unknown 15406 1726854957.08441: variable 'ansible_search_path' from source: unknown 15406 1726854957.08505: calling self._execute() 15406 1726854957.08611: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854957.08625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854957.08655: variable 'omit' from source: magic vars 15406 1726854957.09024: variable 'ansible_distribution_major_version' from source: facts 15406 1726854957.09034: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854957.09161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854957.11012: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854957.11015: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854957.11494: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854957.11500: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854957.11502: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854957.11582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854957.11729: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854957.11757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854957.12092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854957.12096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854957.12101: variable 'ansible_distribution_major_version' from source: facts 15406 1726854957.12103: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15406 1726854957.12105: when evaluation is False, skipping this task 15406 1726854957.12292: _execute() done 15406 1726854957.12298: dumping result to json 15406 1726854957.12301: done dumping result, returning 15406 1726854957.12336: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-3c83-32d3-000000000040] 15406 1726854957.12405: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000040 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15406 1726854957.12561: no more pending results, returning what we have 15406 1726854957.12564: results queue empty 15406 1726854957.12565: checking for any_errors_fatal 15406 1726854957.12573: done 
checking for any_errors_fatal 15406 1726854957.12574: checking for max_fail_percentage 15406 1726854957.12576: done checking for max_fail_percentage 15406 1726854957.12576: checking to see if all hosts have failed and the running result is not ok 15406 1726854957.12577: done checking to see if all hosts have failed 15406 1726854957.12578: getting the remaining hosts for this loop 15406 1726854957.12579: done getting the remaining hosts for this loop 15406 1726854957.12582: getting the next task for host managed_node2 15406 1726854957.12590: done getting next task for host managed_node2 15406 1726854957.12594: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15406 1726854957.12596: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854957.12611: getting variables 15406 1726854957.12613: in VariableManager get_vars() 15406 1726854957.12650: Calling all_inventory to load vars for managed_node2 15406 1726854957.12653: Calling groups_inventory to load vars for managed_node2 15406 1726854957.12656: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854957.12667: Calling all_plugins_play to load vars for managed_node2 15406 1726854957.12670: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854957.12674: Calling groups_plugins_play to load vars for managed_node2 15406 1726854957.13466: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000040 15406 1726854957.13470: WORKER PROCESS EXITING 15406 1726854957.14867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854957.18339: done with get_vars() 15406 1726854957.18430: done getting variables 15406 1726854957.18495: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:55:57 -0400 (0:00:00.118) 0:00:25.008 ****** 15406 1726854957.18600: entering _queue_task() for managed_node2/fail 15406 1726854957.19275: worker is 1 (out of 1 available) 15406 1726854957.19292: exiting _queue_task() for managed_node2/fail 15406 1726854957.19307: done queuing things up, now waiting for results queue to drain 15406 1726854957.19309: waiting for pending results... 
15406 1726854957.19722: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15406 1726854957.19857: in run() - task 0affcc66-ac2b-3c83-32d3-000000000041 15406 1726854957.19871: variable 'ansible_search_path' from source: unknown 15406 1726854957.19875: variable 'ansible_search_path' from source: unknown 15406 1726854957.19924: calling self._execute() 15406 1726854957.20064: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854957.20069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854957.20078: variable 'omit' from source: magic vars 15406 1726854957.20669: variable 'ansible_distribution_major_version' from source: facts 15406 1726854957.20679: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854957.20806: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854957.21015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854957.24032: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854957.24166: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854957.24308: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854957.24426: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854957.24489: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854957.24594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15406 1726854957.24629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854957.24656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854957.24708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854957.24721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854957.24767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854957.24789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854957.24845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854957.24912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854957.24976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854957.25143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854957.25190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854957.25217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854957.25293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854957.25297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854957.25670: variable 'network_connections' from source: play vars 15406 1726854957.25682: variable 'profile' from source: play vars 15406 1726854957.25782: variable 'profile' from source: play vars 15406 1726854957.25785: variable 'interface' from source: set_fact 15406 1726854957.25859: variable 'interface' from source: set_fact 15406 1726854957.25961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854957.26157: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854957.26190: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854957.26223: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854957.26258: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854957.26308: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854957.26326: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854957.26353: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854957.26383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854957.26525: variable '__network_team_connections_defined' from source: role '' defaults 15406 1726854957.26731: variable 'network_connections' from source: play vars 15406 1726854957.26734: variable 'profile' from source: play vars 15406 1726854957.26841: variable 'profile' from source: play vars 15406 1726854957.26844: variable 'interface' from source: set_fact 15406 1726854957.26846: variable 'interface' from source: set_fact 15406 1726854957.26874: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15406 1726854957.26877: when evaluation is False, skipping this task 15406 1726854957.26879: _execute() done 15406 1726854957.26882: dumping result to json 15406 1726854957.26884: done dumping result, returning 15406 1726854957.27055: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-3c83-32d3-000000000041] 15406 1726854957.27066: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000041 15406 1726854957.27139: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000041 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15406 1726854957.27200: no more pending results, returning what we have 15406 1726854957.27204: results queue empty 15406 1726854957.27205: checking for any_errors_fatal 15406 1726854957.27214: done checking for any_errors_fatal 15406 1726854957.27215: checking for max_fail_percentage 15406 1726854957.27217: done checking for max_fail_percentage 15406 1726854957.27217: checking to see if all hosts have failed and the running result is not ok 15406 1726854957.27218: done checking to see if all hosts have failed 15406 1726854957.27219: getting the remaining hosts for this loop 15406 1726854957.27220: done getting the remaining hosts for this loop 15406 1726854957.27224: getting the next task for host managed_node2 15406 1726854957.27235: done getting next task for host managed_node2 15406 1726854957.27239: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15406 1726854957.27242: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854957.27260: getting variables 15406 1726854957.27262: in VariableManager get_vars() 15406 1726854957.27304: Calling all_inventory to load vars for managed_node2 15406 1726854957.27307: Calling groups_inventory to load vars for managed_node2 15406 1726854957.27309: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854957.27321: Calling all_plugins_play to load vars for managed_node2 15406 1726854957.27324: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854957.27327: Calling groups_plugins_play to load vars for managed_node2 15406 1726854957.28040: WORKER PROCESS EXITING 15406 1726854957.29339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854957.31848: done with get_vars() 15406 1726854957.31878: done getting variables 15406 1726854957.31942: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:55:57 -0400 (0:00:00.133) 0:00:25.142 ****** 15406 1726854957.31974: entering _queue_task() for managed_node2/package 15406 1726854957.32323: worker is 1 (out of 1 available) 15406 1726854957.32337: exiting _queue_task() for managed_node2/package 15406 1726854957.32350: done queuing things up, now waiting for results queue to drain 15406 1726854957.32351: waiting for pending results... 
15406 1726854957.32583: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 15406 1726854957.32706: in run() - task 0affcc66-ac2b-3c83-32d3-000000000042 15406 1726854957.32729: variable 'ansible_search_path' from source: unknown 15406 1726854957.32738: variable 'ansible_search_path' from source: unknown 15406 1726854957.32792: calling self._execute() 15406 1726854957.32913: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854957.32932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854957.32975: variable 'omit' from source: magic vars 15406 1726854957.33369: variable 'ansible_distribution_major_version' from source: facts 15406 1726854957.33389: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854957.33605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854957.33891: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854957.33953: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854957.33994: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854957.34030: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854957.34148: variable 'network_packages' from source: role '' defaults 15406 1726854957.34262: variable '__network_provider_setup' from source: role '' defaults 15406 1726854957.34389: variable '__network_service_name_default_nm' from source: role '' defaults 15406 1726854957.34392: variable '__network_service_name_default_nm' from source: role '' defaults 15406 1726854957.34394: variable '__network_packages_default_nm' from source: role '' defaults 15406 1726854957.34579: variable 
'__network_packages_default_nm' from source: role '' defaults 15406 1726854957.34816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854957.43817: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854957.43893: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854957.43947: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854957.43982: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854957.44181: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854957.44189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854957.44452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854957.44456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854957.44488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854957.44509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 
1726854957.44550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854957.44582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854957.44610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854957.44652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854957.44833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854957.45483: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15406 1726854957.45759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854957.45876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854957.45879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854957.45882: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854957.45909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854957.46011: variable 'ansible_python' from source: facts 15406 1726854957.46038: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15406 1726854957.46133: variable '__network_wpa_supplicant_required' from source: role '' defaults 15406 1726854957.46600: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15406 1726854957.46604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854957.46607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854957.46821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854957.46862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854957.46878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854957.46970: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854957.47024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854957.47058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854957.47105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854957.47147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854957.47306: variable 'network_connections' from source: play vars 15406 1726854957.47317: variable 'profile' from source: play vars 15406 1726854957.47471: variable 'profile' from source: play vars 15406 1726854957.47474: variable 'interface' from source: set_fact 15406 1726854957.47521: variable 'interface' from source: set_fact 15406 1726854957.47599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854957.47630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854957.47660: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854957.48166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854957.48169: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854957.48815: variable 'network_connections' from source: play vars 15406 1726854957.48818: variable 'profile' from source: play vars 15406 1726854957.48958: variable 'profile' from source: play vars 15406 1726854957.48969: variable 'interface' from source: set_fact 15406 1726854957.49089: variable 'interface' from source: set_fact 15406 1726854957.49180: variable '__network_packages_default_wireless' from source: role '' defaults 15406 1726854957.49339: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854957.50057: variable 'network_connections' from source: play vars 15406 1726854957.50105: variable 'profile' from source: play vars 15406 1726854957.50293: variable 'profile' from source: play vars 15406 1726854957.50299: variable 'interface' from source: set_fact 15406 1726854957.50698: variable 'interface' from source: set_fact 15406 1726854957.50702: variable '__network_packages_default_team' from source: role '' defaults 15406 1726854957.50798: variable '__network_team_connections_defined' from source: role '' defaults 15406 1726854957.51415: variable 'network_connections' from source: play vars 15406 1726854957.51900: variable 'profile' from source: play vars 15406 1726854957.51903: variable 'profile' from source: play vars 15406 1726854957.51907: variable 'interface' from source: set_fact 15406 1726854957.52069: variable 'interface' from source: set_fact 15406 1726854957.52279: variable '__network_service_name_default_initscripts' from source: role '' defaults 15406 1726854957.52595: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 15406 1726854957.52611: variable '__network_packages_default_initscripts' from source: role '' defaults 15406 1726854957.52793: variable '__network_packages_default_initscripts' from source: role '' defaults 15406 1726854957.53236: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15406 1726854957.54392: variable 'network_connections' from source: play vars 15406 1726854957.54396: variable 'profile' from source: play vars 15406 1726854957.54455: variable 'profile' from source: play vars 15406 1726854957.54535: variable 'interface' from source: set_fact 15406 1726854957.54695: variable 'interface' from source: set_fact 15406 1726854957.54700: variable 'ansible_distribution' from source: facts 15406 1726854957.54702: variable '__network_rh_distros' from source: role '' defaults 15406 1726854957.54795: variable 'ansible_distribution_major_version' from source: facts 15406 1726854957.54802: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15406 1726854957.55225: variable 'ansible_distribution' from source: facts 15406 1726854957.55285: variable '__network_rh_distros' from source: role '' defaults 15406 1726854957.55299: variable 'ansible_distribution_major_version' from source: facts 15406 1726854957.55315: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15406 1726854957.55792: variable 'ansible_distribution' from source: facts 15406 1726854957.55795: variable '__network_rh_distros' from source: role '' defaults 15406 1726854957.55800: variable 'ansible_distribution_major_version' from source: facts 15406 1726854957.55819: variable 'network_provider' from source: set_fact 15406 1726854957.55838: variable 'ansible_facts' from source: unknown 15406 1726854957.57247: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15406 
1726854957.57256: when evaluation is False, skipping this task 15406 1726854957.57263: _execute() done 15406 1726854957.57270: dumping result to json 15406 1726854957.57276: done dumping result, returning 15406 1726854957.57289: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-3c83-32d3-000000000042] 15406 1726854957.57300: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000042 15406 1726854957.57565: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000042 15406 1726854957.57569: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15406 1726854957.57622: no more pending results, returning what we have 15406 1726854957.57626: results queue empty 15406 1726854957.57627: checking for any_errors_fatal 15406 1726854957.57633: done checking for any_errors_fatal 15406 1726854957.57634: checking for max_fail_percentage 15406 1726854957.57636: done checking for max_fail_percentage 15406 1726854957.57637: checking to see if all hosts have failed and the running result is not ok 15406 1726854957.57638: done checking to see if all hosts have failed 15406 1726854957.57639: getting the remaining hosts for this loop 15406 1726854957.57640: done getting the remaining hosts for this loop 15406 1726854957.57644: getting the next task for host managed_node2 15406 1726854957.57650: done getting next task for host managed_node2 15406 1726854957.57654: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15406 1726854957.57656: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15406 1726854957.57669: getting variables 15406 1726854957.57671: in VariableManager get_vars() 15406 1726854957.57835: Calling all_inventory to load vars for managed_node2 15406 1726854957.57838: Calling groups_inventory to load vars for managed_node2 15406 1726854957.57841: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854957.57855: Calling all_plugins_play to load vars for managed_node2 15406 1726854957.57858: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854957.57861: Calling groups_plugins_play to load vars for managed_node2 15406 1726854957.65910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854957.67615: done with get_vars() 15406 1726854957.67650: done getting variables 15406 1726854957.67705: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:55:57 -0400 (0:00:00.357) 0:00:25.500 ****** 15406 1726854957.67733: entering _queue_task() for managed_node2/package 15406 1726854957.68301: worker is 1 (out of 1 available) 15406 1726854957.68311: exiting _queue_task() for managed_node2/package 15406 1726854957.68322: done queuing things up, now waiting for results queue to drain 15406 1726854957.68324: waiting for pending results... 
15406 1726854957.68512: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15406 1726854957.68608: in run() - task 0affcc66-ac2b-3c83-32d3-000000000043 15406 1726854957.68612: variable 'ansible_search_path' from source: unknown 15406 1726854957.68615: variable 'ansible_search_path' from source: unknown 15406 1726854957.68624: calling self._execute() 15406 1726854957.68735: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854957.68746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854957.68764: variable 'omit' from source: magic vars 15406 1726854957.69477: variable 'ansible_distribution_major_version' from source: facts 15406 1726854957.69480: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854957.69777: variable 'network_state' from source: role '' defaults 15406 1726854957.69849: Evaluated conditional (network_state != {}): False 15406 1726854957.70029: when evaluation is False, skipping this task 15406 1726854957.70033: _execute() done 15406 1726854957.70036: dumping result to json 15406 1726854957.70038: done dumping result, returning 15406 1726854957.70041: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-3c83-32d3-000000000043] 15406 1726854957.70044: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000043 15406 1726854957.70131: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000043 15406 1726854957.70134: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15406 1726854957.70186: no more pending results, returning what we have 15406 1726854957.70192: results queue empty 15406 1726854957.70193: checking 
for any_errors_fatal 15406 1726854957.70206: done checking for any_errors_fatal 15406 1726854957.70207: checking for max_fail_percentage 15406 1726854957.70209: done checking for max_fail_percentage 15406 1726854957.70210: checking to see if all hosts have failed and the running result is not ok 15406 1726854957.70211: done checking to see if all hosts have failed 15406 1726854957.70211: getting the remaining hosts for this loop 15406 1726854957.70213: done getting the remaining hosts for this loop 15406 1726854957.70217: getting the next task for host managed_node2 15406 1726854957.70224: done getting next task for host managed_node2 15406 1726854957.70228: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15406 1726854957.70230: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854957.70361: getting variables 15406 1726854957.70363: in VariableManager get_vars() 15406 1726854957.70409: Calling all_inventory to load vars for managed_node2 15406 1726854957.70411: Calling groups_inventory to load vars for managed_node2 15406 1726854957.70414: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854957.70428: Calling all_plugins_play to load vars for managed_node2 15406 1726854957.70431: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854957.70435: Calling groups_plugins_play to load vars for managed_node2 15406 1726854957.73681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854957.77041: done with get_vars() 15406 1726854957.77072: done getting variables 15406 1726854957.77251: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:55:57 -0400 (0:00:00.095) 0:00:25.595 ****** 15406 1726854957.77284: entering _queue_task() for managed_node2/package 15406 1726854957.78237: worker is 1 (out of 1 available) 15406 1726854957.78249: exiting _queue_task() for managed_node2/package 15406 1726854957.78262: done queuing things up, now waiting for results queue to drain 15406 1726854957.78264: waiting for pending results... 
15406 1726854957.78959: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15406 1726854957.78964: in run() - task 0affcc66-ac2b-3c83-32d3-000000000044 15406 1726854957.78968: variable 'ansible_search_path' from source: unknown 15406 1726854957.78970: variable 'ansible_search_path' from source: unknown 15406 1726854957.78993: calling self._execute() 15406 1726854957.79333: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854957.79337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854957.79341: variable 'omit' from source: magic vars 15406 1726854957.79779: variable 'ansible_distribution_major_version' from source: facts 15406 1726854957.79803: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854957.79952: variable 'network_state' from source: role '' defaults 15406 1726854957.79968: Evaluated conditional (network_state != {}): False 15406 1726854957.79975: when evaluation is False, skipping this task 15406 1726854957.79985: _execute() done 15406 1726854957.79995: dumping result to json 15406 1726854957.80045: done dumping result, returning 15406 1726854957.80050: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-3c83-32d3-000000000044] 15406 1726854957.80053: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000044 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15406 1726854957.80312: no more pending results, returning what we have 15406 1726854957.80316: results queue empty 15406 1726854957.80317: checking for any_errors_fatal 15406 1726854957.80326: done checking for any_errors_fatal 15406 1726854957.80326: checking for max_fail_percentage 15406 
1726854957.80328: done checking for max_fail_percentage 15406 1726854957.80329: checking to see if all hosts have failed and the running result is not ok 15406 1726854957.80330: done checking to see if all hosts have failed 15406 1726854957.80330: getting the remaining hosts for this loop 15406 1726854957.80332: done getting the remaining hosts for this loop 15406 1726854957.80335: getting the next task for host managed_node2 15406 1726854957.80341: done getting next task for host managed_node2 15406 1726854957.80345: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15406 1726854957.80347: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854957.80478: getting variables 15406 1726854957.80481: in VariableManager get_vars() 15406 1726854957.80525: Calling all_inventory to load vars for managed_node2 15406 1726854957.80529: Calling groups_inventory to load vars for managed_node2 15406 1726854957.80532: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854957.80542: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000044 15406 1726854957.80545: WORKER PROCESS EXITING 15406 1726854957.80557: Calling all_plugins_play to load vars for managed_node2 15406 1726854957.80560: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854957.80563: Calling groups_plugins_play to load vars for managed_node2 15406 1726854957.82309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854957.84072: done with get_vars() 15406 1726854957.84100: done getting variables 15406 1726854957.84162: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:55:57 -0400 (0:00:00.069) 0:00:25.664 ****** 15406 1726854957.84192: entering _queue_task() for managed_node2/service 15406 1726854957.84693: worker is 1 (out of 1 available) 15406 1726854957.84706: exiting _queue_task() for managed_node2/service 15406 1726854957.84716: done queuing things up, now waiting for results queue to drain 15406 1726854957.84717: waiting for pending results... 15406 1726854957.85013: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15406 1726854957.85020: in run() - task 0affcc66-ac2b-3c83-32d3-000000000045 15406 1726854957.85023: variable 'ansible_search_path' from source: unknown 15406 1726854957.85026: variable 'ansible_search_path' from source: unknown 15406 1726854957.85064: calling self._execute() 15406 1726854957.85182: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854957.85201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854957.85225: variable 'omit' from source: magic vars 15406 1726854957.85655: variable 'ansible_distribution_major_version' from source: facts 15406 1726854957.85673: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854957.85832: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854957.86095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15406 1726854957.89619: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854957.89710: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854957.89759: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854957.89806: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854957.89840: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854957.89942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854957.89986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854957.90021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854957.90063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854957.90091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854957.90202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15406 1726854957.90206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854957.90209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854957.90255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854957.90275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854957.90334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854957.90360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854957.90386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854957.90442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854957.90493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854957.90670: variable 'network_connections' from source: play vars 15406 1726854957.90690: variable 'profile' from source: play vars 15406 1726854957.90772: variable 'profile' from source: play vars 15406 1726854957.90781: variable 'interface' from source: set_fact 15406 1726854957.90857: variable 'interface' from source: set_fact 15406 1726854957.90965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854957.91138: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854957.91189: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854957.91227: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854957.91295: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854957.91318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854957.91345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854957.91375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854957.91421: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854957.91492: variable 
'__network_team_connections_defined' from source: role '' defaults 15406 1726854957.91753: variable 'network_connections' from source: play vars 15406 1726854957.91764: variable 'profile' from source: play vars 15406 1726854957.91943: variable 'profile' from source: play vars 15406 1726854957.91946: variable 'interface' from source: set_fact 15406 1726854957.91948: variable 'interface' from source: set_fact 15406 1726854957.91951: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15406 1726854957.91953: when evaluation is False, skipping this task 15406 1726854957.91958: _execute() done 15406 1726854957.91966: dumping result to json 15406 1726854957.91973: done dumping result, returning 15406 1726854957.91983: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-3c83-32d3-000000000045] 15406 1726854957.92006: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000045 15406 1726854957.92143: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000045 15406 1726854957.92147: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15406 1726854957.92209: no more pending results, returning what we have 15406 1726854957.92213: results queue empty 15406 1726854957.92215: checking for any_errors_fatal 15406 1726854957.92222: done checking for any_errors_fatal 15406 1726854957.92223: checking for max_fail_percentage 15406 1726854957.92225: done checking for max_fail_percentage 15406 1726854957.92226: checking to see if all hosts have failed and the running result is not ok 15406 1726854957.92227: done checking to see if all hosts have failed 15406 1726854957.92227: getting the remaining hosts for this loop 15406 1726854957.92229: 
done getting the remaining hosts for this loop 15406 1726854957.92233: getting the next task for host managed_node2 15406 1726854957.92239: done getting next task for host managed_node2 15406 1726854957.92243: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15406 1726854957.92245: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854957.92258: getting variables 15406 1726854957.92262: in VariableManager get_vars() 15406 1726854957.92306: Calling all_inventory to load vars for managed_node2 15406 1726854957.92309: Calling groups_inventory to load vars for managed_node2 15406 1726854957.92312: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854957.92323: Calling all_plugins_play to load vars for managed_node2 15406 1726854957.92326: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854957.92328: Calling groups_plugins_play to load vars for managed_node2 15406 1726854957.94995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854957.98538: done with get_vars() 15406 1726854957.98562: done getting variables 15406 1726854957.98632: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:55:57 -0400 
(0:00:00.144) 0:00:25.809 ****** 15406 1726854957.98663: entering _queue_task() for managed_node2/service 15406 1726854957.99011: worker is 1 (out of 1 available) 15406 1726854957.99026: exiting _queue_task() for managed_node2/service 15406 1726854957.99045: done queuing things up, now waiting for results queue to drain 15406 1726854957.99046: waiting for pending results... 15406 1726854957.99377: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15406 1726854957.99382: in run() - task 0affcc66-ac2b-3c83-32d3-000000000046 15406 1726854957.99386: variable 'ansible_search_path' from source: unknown 15406 1726854957.99391: variable 'ansible_search_path' from source: unknown 15406 1726854957.99475: calling self._execute() 15406 1726854957.99513: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854957.99519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854957.99532: variable 'omit' from source: magic vars 15406 1726854958.00057: variable 'ansible_distribution_major_version' from source: facts 15406 1726854958.00062: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854958.00117: variable 'network_provider' from source: set_fact 15406 1726854958.00121: variable 'network_state' from source: role '' defaults 15406 1726854958.00135: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15406 1726854958.00138: variable 'omit' from source: magic vars 15406 1726854958.00180: variable 'omit' from source: magic vars 15406 1726854958.00218: variable 'network_service_name' from source: role '' defaults 15406 1726854958.00364: variable 'network_service_name' from source: role '' defaults 15406 1726854958.00592: variable '__network_provider_setup' from source: role '' defaults 15406 1726854958.00707: variable '__network_service_name_default_nm' from source: role '' defaults 15406 
1726854958.00890: variable '__network_service_name_default_nm' from source: role '' defaults 15406 1726854958.00900: variable '__network_packages_default_nm' from source: role '' defaults 15406 1726854958.00956: variable '__network_packages_default_nm' from source: role '' defaults 15406 1726854958.01443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854958.03995: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854958.04002: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854958.04005: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854958.04007: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854958.04043: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854958.04131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854958.04179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854958.04235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854958.04289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 
1726854958.04313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854958.04371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854958.04406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854958.04451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854958.04485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854958.04560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854958.04803: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15406 1726854958.04929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854958.05266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854958.05269: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854958.05271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854958.05273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854958.05275: variable 'ansible_python' from source: facts 15406 1726854958.05312: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15406 1726854958.05402: variable '__network_wpa_supplicant_required' from source: role '' defaults 15406 1726854958.05493: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15406 1726854958.05627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854958.05661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854958.05690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854958.05735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854958.05760: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854958.05811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854958.05854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854958.05881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854958.05929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854958.05962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854958.06096: variable 'network_connections' from source: play vars 15406 1726854958.06180: variable 'profile' from source: play vars 15406 1726854958.06185: variable 'profile' from source: play vars 15406 1726854958.06200: variable 'interface' from source: set_fact 15406 1726854958.06262: variable 'interface' from source: set_fact 15406 1726854958.06403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854958.06686: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854958.06751: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854958.06820: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854958.06874: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854958.06950: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854958.06984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854958.07027: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854958.07072: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854958.07160: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854958.07443: variable 'network_connections' from source: play vars 15406 1726854958.07455: variable 'profile' from source: play vars 15406 1726854958.07541: variable 'profile' from source: play vars 15406 1726854958.07552: variable 'interface' from source: set_fact 15406 1726854958.07623: variable 'interface' from source: set_fact 15406 1726854958.07708: variable '__network_packages_default_wireless' from source: role '' defaults 15406 1726854958.07746: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854958.08044: variable 'network_connections' from source: play vars 15406 1726854958.08052: variable 'profile' from source: play vars 15406 
1726854958.08123: variable 'profile' from source: play vars 15406 1726854958.08139: variable 'interface' from source: set_fact 15406 1726854958.08255: variable 'interface' from source: set_fact 15406 1726854958.08307: variable '__network_packages_default_team' from source: role '' defaults 15406 1726854958.08483: variable '__network_team_connections_defined' from source: role '' defaults 15406 1726854958.09195: variable 'network_connections' from source: play vars 15406 1726854958.09201: variable 'profile' from source: play vars 15406 1726854958.09245: variable 'profile' from source: play vars 15406 1726854958.09393: variable 'interface' from source: set_fact 15406 1726854958.09456: variable 'interface' from source: set_fact 15406 1726854958.09557: variable '__network_service_name_default_initscripts' from source: role '' defaults 15406 1726854958.09661: variable '__network_service_name_default_initscripts' from source: role '' defaults 15406 1726854958.09678: variable '__network_packages_default_initscripts' from source: role '' defaults 15406 1726854958.09792: variable '__network_packages_default_initscripts' from source: role '' defaults 15406 1726854958.10085: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15406 1726854958.11344: variable 'network_connections' from source: play vars 15406 1726854958.11364: variable 'profile' from source: play vars 15406 1726854958.11460: variable 'profile' from source: play vars 15406 1726854958.11500: variable 'interface' from source: set_fact 15406 1726854958.11578: variable 'interface' from source: set_fact 15406 1726854958.11596: variable 'ansible_distribution' from source: facts 15406 1726854958.11608: variable '__network_rh_distros' from source: role '' defaults 15406 1726854958.11618: variable 'ansible_distribution_major_version' from source: facts 15406 1726854958.11655: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15406 
1726854958.11836: variable 'ansible_distribution' from source: facts 15406 1726854958.11962: variable '__network_rh_distros' from source: role '' defaults 15406 1726854958.11965: variable 'ansible_distribution_major_version' from source: facts 15406 1726854958.11967: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15406 1726854958.12044: variable 'ansible_distribution' from source: facts 15406 1726854958.12052: variable '__network_rh_distros' from source: role '' defaults 15406 1726854958.12059: variable 'ansible_distribution_major_version' from source: facts 15406 1726854958.12108: variable 'network_provider' from source: set_fact 15406 1726854958.12133: variable 'omit' from source: magic vars 15406 1726854958.12159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854958.12201: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854958.12223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854958.12241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854958.12253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854958.12289: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854958.12300: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854958.12307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854958.12404: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854958.12415: Set connection var ansible_timeout to 10 15406 1726854958.12420: Set connection var ansible_connection to ssh 15406 1726854958.12427: Set connection var 
ansible_shell_type to sh 15406 1726854958.12433: Set connection var ansible_shell_executable to /bin/sh 15406 1726854958.12442: Set connection var ansible_pipelining to False 15406 1726854958.12468: variable 'ansible_shell_executable' from source: unknown 15406 1726854958.12476: variable 'ansible_connection' from source: unknown 15406 1726854958.12483: variable 'ansible_module_compression' from source: unknown 15406 1726854958.12493: variable 'ansible_shell_type' from source: unknown 15406 1726854958.12509: variable 'ansible_shell_executable' from source: unknown 15406 1726854958.12593: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854958.12603: variable 'ansible_pipelining' from source: unknown 15406 1726854958.12605: variable 'ansible_timeout' from source: unknown 15406 1726854958.12613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854958.12662: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854958.12678: variable 'omit' from source: magic vars 15406 1726854958.12692: starting attempt loop 15406 1726854958.12703: running the handler 15406 1726854958.12786: variable 'ansible_facts' from source: unknown 15406 1726854958.13817: _low_level_execute_command(): starting 15406 1726854958.13829: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854958.15028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854958.15072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854958.15100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854958.15149: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854958.15168: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854958.15179: stderr chunk (state=3): >>>debug2: match not found <<< 15406 1726854958.15247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854958.15300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854958.15328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854958.15380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854958.15441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854958.17270: stdout chunk (state=3): >>>/root <<< 15406 1726854958.17495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854958.17500: stderr chunk (state=3): >>><<< 15406 1726854958.17507: stdout chunk (state=3): >>><<< 15406 1726854958.17511: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854958.17513: _low_level_execute_command(): starting 15406 1726854958.17522: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854958.1749563-16592-277943017586525 `" && echo ansible-tmp-1726854958.1749563-16592-277943017586525="` echo /root/.ansible/tmp/ansible-tmp-1726854958.1749563-16592-277943017586525 `" ) && sleep 0' 15406 1726854958.18251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854958.18261: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854958.18296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854958.18300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854958.18305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854958.18318: stderr chunk (state=3): >>>debug2: match not found <<< 15406 
1726854958.18320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854958.18492: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15406 1726854958.18496: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 15406 1726854958.18498: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15406 1726854958.18500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854958.18502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854958.18504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854958.18512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854958.18514: stderr chunk (state=3): >>>debug2: match found <<< 15406 1726854958.18516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854958.18518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854958.18521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854958.18523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854958.18715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854958.20651: stdout chunk (state=3): >>>ansible-tmp-1726854958.1749563-16592-277943017586525=/root/.ansible/tmp/ansible-tmp-1726854958.1749563-16592-277943017586525 <<< 15406 1726854958.20792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854958.20796: stdout chunk (state=3): >>><<< 15406 1726854958.20804: stderr chunk (state=3): >>><<< 15406 1726854958.20996: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854958.1749563-16592-277943017586525=/root/.ansible/tmp/ansible-tmp-1726854958.1749563-16592-277943017586525 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854958.21004: variable 'ansible_module_compression' from source: unknown 15406 1726854958.21007: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15406 1726854958.21118: variable 'ansible_facts' from source: unknown 15406 1726854958.21721: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854958.1749563-16592-277943017586525/AnsiballZ_systemd.py 15406 1726854958.21934: Sending initial data 15406 1726854958.21937: Sent initial data (156 bytes) 15406 1726854958.22911: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854958.22985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854958.23024: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854958.23131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854958.24679: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: 
Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854958.24782: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15406 1726854958.24869: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpf43tddcq /root/.ansible/tmp/ansible-tmp-1726854958.1749563-16592-277943017586525/AnsiballZ_systemd.py <<< 15406 1726854958.24873: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854958.1749563-16592-277943017586525/AnsiballZ_systemd.py" <<< 15406 1726854958.24961: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpf43tddcq" to remote "/root/.ansible/tmp/ansible-tmp-1726854958.1749563-16592-277943017586525/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854958.1749563-16592-277943017586525/AnsiballZ_systemd.py" <<< 15406 1726854958.26951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854958.26954: stderr chunk (state=3): >>><<< 15406 1726854958.26956: stdout chunk (state=3): >>><<< 15406 1726854958.27009: done transferring module to remote 15406 1726854958.27021: _low_level_execute_command(): starting 15406 1726854958.27033: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854958.1749563-16592-277943017586525/ /root/.ansible/tmp/ansible-tmp-1726854958.1749563-16592-277943017586525/AnsiballZ_systemd.py && sleep 0' 15406 1726854958.27948: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854958.28046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854958.28082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854958.28182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854958.29948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854958.30054: stderr chunk (state=3): >>><<< 15406 1726854958.30057: stdout chunk (state=3): >>><<< 15406 1726854958.30092: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854958.30095: _low_level_execute_command(): starting 15406 1726854958.30176: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854958.1749563-16592-277943017586525/AnsiballZ_systemd.py && sleep 0' 15406 1726854958.30722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854958.30741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854958.30756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854958.30808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854958.30881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854958.30906: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 15406 1726854958.30961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854958.31092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854958.60201: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6952", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainStartTimestampMonotonic": "534997129", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainHandoffTimestampMonotonic": "535012691", "ExecMainPID": "6952", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4583424", "MemoryPeak": "8568832", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3311087616", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1029043000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", 
"StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override 
cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": 
"9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service multi-user.target cloud-init.service shutdown.target", "After": "network-pre.target basic.target cloud-init-local.service dbus-broker.service dbus.socket sysinit.target systemd-journald.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:54:20 EDT", "StateChangeTimestampMonotonic": "643755242", "InactiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveExitTimestampMonotonic": "534998649", "ActiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveEnterTimestampMonotonic": "535088340", "ActiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveExitTimestampMonotonic": "534969671", "InactiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveEnterTimestampMonotonic": "534993742", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", 
"ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ConditionTimestampMonotonic": "534995785", "AssertTimestamp": "Fri 2024-09-20 13:52:31 EDT", "AssertTimestampMonotonic": "534995794", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6b946ff91244430a922b8c06a2cf2878", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15406 1726854958.61771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854958.61784: stdout chunk (state=3): >>><<< 15406 1726854958.61802: stderr chunk (state=3): >>><<< 15406 1726854958.61826: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6952", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": 
"[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainStartTimestampMonotonic": "534997129", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainHandoffTimestampMonotonic": "535012691", "ExecMainPID": "6952", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4583424", "MemoryPeak": "8568832", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3311087616", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1029043000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", 
"CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", 
"LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service multi-user.target cloud-init.service shutdown.target", "After": "network-pre.target basic.target cloud-init-local.service dbus-broker.service dbus.socket sysinit.target systemd-journald.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:54:20 EDT", "StateChangeTimestampMonotonic": "643755242", "InactiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", 
"InactiveExitTimestampMonotonic": "534998649", "ActiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveEnterTimestampMonotonic": "535088340", "ActiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveExitTimestampMonotonic": "534969671", "InactiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveEnterTimestampMonotonic": "534993742", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ConditionTimestampMonotonic": "534995785", "AssertTimestamp": "Fri 2024-09-20 13:52:31 EDT", "AssertTimestampMonotonic": "534995794", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6b946ff91244430a922b8c06a2cf2878", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854958.62429: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854958.1749563-16592-277943017586525/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854958.62904: _low_level_execute_command(): starting 15406 1726854958.62908: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854958.1749563-16592-277943017586525/ > /dev/null 2>&1 && sleep 0' 15406 1726854958.64253: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 15406 1726854958.64270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854958.64482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854958.64500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854958.64590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 15406 1726854958.64604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854958.64659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854958.64768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854958.64806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854958.65001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854958.67395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854958.67403: stdout chunk (state=3): >>><<< 15406 1726854958.67407: stderr chunk (state=3): >>><<< 15406 1726854958.67411: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854958.67414: handler run complete 15406 1726854958.67417: attempt loop complete, returning result 15406 1726854958.67420: _execute() done 15406 1726854958.67423: dumping result to json 15406 1726854958.67425: done dumping result, returning 15406 1726854958.67428: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-3c83-32d3-000000000046] 15406 1726854958.67431: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000046 15406 1726854958.68148: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000046 15406 1726854958.68153: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15406 1726854958.68204: no more pending results, returning what we have 15406 1726854958.68208: results queue empty 15406 1726854958.68209: checking for 
any_errors_fatal 15406 1726854958.68215: done checking for any_errors_fatal 15406 1726854958.68216: checking for max_fail_percentage 15406 1726854958.68218: done checking for max_fail_percentage 15406 1726854958.68219: checking to see if all hosts have failed and the running result is not ok 15406 1726854958.68220: done checking to see if all hosts have failed 15406 1726854958.68220: getting the remaining hosts for this loop 15406 1726854958.68221: done getting the remaining hosts for this loop 15406 1726854958.68225: getting the next task for host managed_node2 15406 1726854958.68232: done getting next task for host managed_node2 15406 1726854958.68235: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15406 1726854958.68237: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854958.68248: getting variables 15406 1726854958.68250: in VariableManager get_vars() 15406 1726854958.68284: Calling all_inventory to load vars for managed_node2 15406 1726854958.68293: Calling groups_inventory to load vars for managed_node2 15406 1726854958.68296: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854958.68308: Calling all_plugins_play to load vars for managed_node2 15406 1726854958.68311: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854958.68314: Calling groups_plugins_play to load vars for managed_node2 15406 1726854958.71317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854958.74684: done with get_vars() 15406 1726854958.74802: done getting variables 15406 1726854958.74871: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:55:58 -0400 (0:00:00.763) 0:00:26.572 ****** 15406 1726854958.75011: entering _queue_task() for managed_node2/service 15406 1726854958.75736: worker is 1 (out of 1 available) 15406 1726854958.75748: exiting _queue_task() for managed_node2/service 15406 1726854958.75759: done queuing things up, now waiting for results queue to drain 15406 1726854958.75761: waiting for pending results... 
15406 1726854958.76344: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15406 1726854958.76392: in run() - task 0affcc66-ac2b-3c83-32d3-000000000047 15406 1726854958.76455: variable 'ansible_search_path' from source: unknown 15406 1726854958.76466: variable 'ansible_search_path' from source: unknown 15406 1726854958.76795: calling self._execute() 15406 1726854958.76801: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854958.76805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854958.76808: variable 'omit' from source: magic vars 15406 1726854958.77579: variable 'ansible_distribution_major_version' from source: facts 15406 1726854958.77676: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854958.77914: variable 'network_provider' from source: set_fact 15406 1726854958.77926: Evaluated conditional (network_provider == "nm"): True 15406 1726854958.78133: variable '__network_wpa_supplicant_required' from source: role '' defaults 15406 1726854958.78343: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15406 1726854958.78600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854958.84295: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854958.84302: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854958.84323: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854958.84361: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854958.84399: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854958.84484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854958.84530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854958.84559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854958.84611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854958.84636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854958.84685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854958.84719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854958.84752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854958.84797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854958.84821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854958.84870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854958.84905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854958.84934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854958.84980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854958.85005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854958.85154: variable 'network_connections' from source: play vars 15406 1726854958.85178: variable 'profile' from source: play vars 15406 1726854958.85256: variable 'profile' from source: play vars 15406 1726854958.85266: variable 'interface' from source: set_fact 15406 1726854958.85339: variable 'interface' from source: set_fact 15406 1726854958.85494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854958.85619: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854958.85660: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854958.85697: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854958.85738: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854958.85788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854958.85823: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854958.85853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854958.85882: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854958.85944: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854958.86201: variable 'network_connections' from source: play vars 15406 1726854958.86257: variable 'profile' from source: play vars 15406 1726854958.86284: variable 'profile' from source: play vars 15406 1726854958.86297: variable 'interface' from source: set_fact 15406 1726854958.86365: variable 'interface' from source: set_fact 15406 1726854958.86404: Evaluated conditional (__network_wpa_supplicant_required): False 15406 1726854958.86413: when evaluation is False, skipping this task 15406 1726854958.86592: _execute() done 15406 1726854958.86607: dumping result 
to json 15406 1726854958.86610: done dumping result, returning 15406 1726854958.86612: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-3c83-32d3-000000000047] 15406 1726854958.86614: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000047 15406 1726854958.86683: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000047 15406 1726854958.86689: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15406 1726854958.86736: no more pending results, returning what we have 15406 1726854958.86744: results queue empty 15406 1726854958.86745: checking for any_errors_fatal 15406 1726854958.86760: done checking for any_errors_fatal 15406 1726854958.86761: checking for max_fail_percentage 15406 1726854958.86763: done checking for max_fail_percentage 15406 1726854958.86763: checking to see if all hosts have failed and the running result is not ok 15406 1726854958.86764: done checking to see if all hosts have failed 15406 1726854958.86765: getting the remaining hosts for this loop 15406 1726854958.86766: done getting the remaining hosts for this loop 15406 1726854958.86769: getting the next task for host managed_node2 15406 1726854958.86775: done getting next task for host managed_node2 15406 1726854958.86778: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15406 1726854958.86780: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854958.86794: getting variables 15406 1726854958.86795: in VariableManager get_vars() 15406 1726854958.86830: Calling all_inventory to load vars for managed_node2 15406 1726854958.86832: Calling groups_inventory to load vars for managed_node2 15406 1726854958.86834: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854958.86843: Calling all_plugins_play to load vars for managed_node2 15406 1726854958.86845: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854958.86847: Calling groups_plugins_play to load vars for managed_node2 15406 1726854958.90173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854958.94785: done with get_vars() 15406 1726854958.94818: done getting variables 15406 1726854958.94998: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:55:58 -0400 (0:00:00.200) 0:00:26.773 ****** 15406 1726854958.95028: entering _queue_task() for managed_node2/service 15406 1726854958.95620: worker is 1 (out of 1 available) 15406 1726854958.95632: exiting _queue_task() for managed_node2/service 15406 1726854958.95737: done queuing things up, now waiting for results queue to drain 15406 1726854958.95738: waiting for pending results... 
15406 1726854958.96307: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 15406 1726854958.96595: in run() - task 0affcc66-ac2b-3c83-32d3-000000000048 15406 1726854958.96602: variable 'ansible_search_path' from source: unknown 15406 1726854958.96605: variable 'ansible_search_path' from source: unknown 15406 1726854958.96608: calling self._execute() 15406 1726854958.96790: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854958.96795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854958.96801: variable 'omit' from source: magic vars 15406 1726854958.97060: variable 'ansible_distribution_major_version' from source: facts 15406 1726854958.97094: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854958.97294: variable 'network_provider' from source: set_fact 15406 1726854958.97300: Evaluated conditional (network_provider == "initscripts"): False 15406 1726854958.97303: when evaluation is False, skipping this task 15406 1726854958.97305: _execute() done 15406 1726854958.97308: dumping result to json 15406 1726854958.97310: done dumping result, returning 15406 1726854958.97491: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-3c83-32d3-000000000048] 15406 1726854958.97494: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000048 15406 1726854958.97560: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000048 15406 1726854958.97563: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15406 1726854958.97602: no more pending results, returning what we have 15406 1726854958.97605: results queue empty 15406 1726854958.97606: checking for any_errors_fatal 15406 1726854958.97613: done checking for 
any_errors_fatal 15406 1726854958.97614: checking for max_fail_percentage 15406 1726854958.97616: done checking for max_fail_percentage 15406 1726854958.97617: checking to see if all hosts have failed and the running result is not ok 15406 1726854958.97618: done checking to see if all hosts have failed 15406 1726854958.97618: getting the remaining hosts for this loop 15406 1726854958.97620: done getting the remaining hosts for this loop 15406 1726854958.97623: getting the next task for host managed_node2 15406 1726854958.97629: done getting next task for host managed_node2 15406 1726854958.97633: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15406 1726854958.97636: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854958.97649: getting variables 15406 1726854958.97651: in VariableManager get_vars() 15406 1726854958.97689: Calling all_inventory to load vars for managed_node2 15406 1726854958.97692: Calling groups_inventory to load vars for managed_node2 15406 1726854958.97694: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854958.97704: Calling all_plugins_play to load vars for managed_node2 15406 1726854958.97706: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854958.97709: Calling groups_plugins_play to load vars for managed_node2 15406 1726854959.00522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854959.02861: done with get_vars() 15406 1726854959.02895: done getting variables 15406 1726854959.02960: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:55:59 -0400 (0:00:00.079) 0:00:26.852 ****** 15406 1726854959.03007: entering _queue_task() for managed_node2/copy 15406 1726854959.03617: worker is 1 (out of 1 available) 15406 1726854959.03629: exiting _queue_task() for managed_node2/copy 15406 1726854959.03759: done queuing things up, now waiting for results queue to drain 15406 1726854959.03761: waiting for pending results... 
15406 1726854959.04980: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15406 1726854959.05080: in run() - task 0affcc66-ac2b-3c83-32d3-000000000049 15406 1726854959.05610: variable 'ansible_search_path' from source: unknown 15406 1726854959.05614: variable 'ansible_search_path' from source: unknown 15406 1726854959.05652: calling self._execute() 15406 1726854959.05934: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854959.05945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854959.05949: variable 'omit' from source: magic vars 15406 1726854959.07294: variable 'ansible_distribution_major_version' from source: facts 15406 1726854959.07302: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854959.07481: variable 'network_provider' from source: set_fact 15406 1726854959.07561: Evaluated conditional (network_provider == "initscripts"): False 15406 1726854959.07564: when evaluation is False, skipping this task 15406 1726854959.07566: _execute() done 15406 1726854959.07569: dumping result to json 15406 1726854959.07570: done dumping result, returning 15406 1726854959.07574: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-3c83-32d3-000000000049] 15406 1726854959.07576: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000049 15406 1726854959.07650: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000049 15406 1726854959.07653: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15406 1726854959.07716: no more pending results, returning what we have 15406 1726854959.07720: results queue empty 15406 1726854959.07722: checking for 
any_errors_fatal 15406 1726854959.07730: done checking for any_errors_fatal 15406 1726854959.07730: checking for max_fail_percentage 15406 1726854959.07732: done checking for max_fail_percentage 15406 1726854959.07733: checking to see if all hosts have failed and the running result is not ok 15406 1726854959.07734: done checking to see if all hosts have failed 15406 1726854959.07734: getting the remaining hosts for this loop 15406 1726854959.07736: done getting the remaining hosts for this loop 15406 1726854959.07740: getting the next task for host managed_node2 15406 1726854959.07746: done getting next task for host managed_node2 15406 1726854959.07751: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15406 1726854959.07753: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854959.07768: getting variables 15406 1726854959.07769: in VariableManager get_vars() 15406 1726854959.07811: Calling all_inventory to load vars for managed_node2 15406 1726854959.07813: Calling groups_inventory to load vars for managed_node2 15406 1726854959.07816: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854959.07827: Calling all_plugins_play to load vars for managed_node2 15406 1726854959.07829: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854959.07831: Calling groups_plugins_play to load vars for managed_node2 15406 1726854959.10945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854959.15075: done with get_vars() 15406 1726854959.15123: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:55:59 -0400 (0:00:00.122) 0:00:26.974 ****** 15406 1726854959.15296: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 15406 1726854959.16043: worker is 1 (out of 1 available) 15406 1726854959.16057: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 15406 1726854959.16069: done queuing things up, now waiting for results queue to drain 15406 1726854959.16071: waiting for pending results... 
15406 1726854959.16713: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15406 1726854959.17116: in run() - task 0affcc66-ac2b-3c83-32d3-00000000004a 15406 1726854959.17121: variable 'ansible_search_path' from source: unknown 15406 1726854959.17124: variable 'ansible_search_path' from source: unknown 15406 1726854959.17127: calling self._execute() 15406 1726854959.17290: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854959.17307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854959.17311: variable 'omit' from source: magic vars 15406 1726854959.18602: variable 'ansible_distribution_major_version' from source: facts 15406 1726854959.18606: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854959.18610: variable 'omit' from source: magic vars 15406 1726854959.18612: variable 'omit' from source: magic vars 15406 1726854959.18769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854959.21338: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854959.21538: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854959.21575: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854959.21768: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854959.21863: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854959.21949: variable 'network_provider' from source: set_fact 15406 1726854959.22311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854959.22407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854959.22411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854959.22415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854959.22429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854959.22894: variable 'omit' from source: magic vars 15406 1726854959.23269: variable 'omit' from source: magic vars 15406 1726854959.23504: variable 'network_connections' from source: play vars 15406 1726854959.23545: variable 'profile' from source: play vars 15406 1726854959.23610: variable 'profile' from source: play vars 15406 1726854959.23614: variable 'interface' from source: set_fact 15406 1726854959.23681: variable 'interface' from source: set_fact 15406 1726854959.23826: variable 'omit' from source: magic vars 15406 1726854959.23834: variable '__lsr_ansible_managed' from source: task vars 15406 1726854959.23902: variable '__lsr_ansible_managed' from source: task vars 15406 1726854959.24094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15406 1726854959.24649: Loaded config def from plugin (lookup/template) 15406 1726854959.24652: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15406 1726854959.24681: File lookup term: get_ansible_managed.j2 15406 1726854959.24684: variable 'ansible_search_path' from source: unknown 15406 1726854959.24690: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15406 1726854959.24704: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15406 1726854959.24728: variable 'ansible_search_path' from source: unknown 15406 1726854959.32867: variable 'ansible_managed' from source: unknown 15406 1726854959.32942: variable 'omit' from source: magic vars 15406 1726854959.32995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854959.33034: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854959.33059: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854959.33088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854959.33109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854959.33146: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854959.33149: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854959.33151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854959.33256: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854959.33259: Set connection var ansible_timeout to 10 15406 1726854959.33262: Set connection var ansible_connection to ssh 15406 1726854959.33264: Set connection var ansible_shell_type to sh 15406 1726854959.33270: Set connection var ansible_shell_executable to /bin/sh 15406 1726854959.33299: Set connection var ansible_pipelining to False 15406 1726854959.33304: variable 'ansible_shell_executable' from source: unknown 15406 1726854959.33307: variable 'ansible_connection' from source: unknown 15406 1726854959.33311: variable 'ansible_module_compression' from source: unknown 15406 1726854959.33314: variable 'ansible_shell_type' from source: unknown 15406 1726854959.33316: variable 'ansible_shell_executable' from source: unknown 15406 1726854959.33318: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854959.33408: variable 'ansible_pipelining' from source: unknown 15406 1726854959.33411: variable 'ansible_timeout' from source: unknown 15406 1726854959.33414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854959.33477: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854959.33490: variable 'omit' from source: magic vars 15406 1726854959.33496: starting attempt loop 15406 1726854959.33499: running the handler 15406 1726854959.33516: _low_level_execute_command(): starting 15406 1726854959.33519: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854959.34206: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854959.34223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854959.34309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854959.34346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854959.34357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854959.34366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854959.34475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854959.36193: stdout 
chunk (state=3): >>>/root <<< 15406 1726854959.36344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854959.36347: stdout chunk (state=3): >>><<< 15406 1726854959.36350: stderr chunk (state=3): >>><<< 15406 1726854959.36370: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854959.36392: _low_level_execute_command(): starting 15406 1726854959.36473: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854959.3637705-16647-262353704741441 `" && echo ansible-tmp-1726854959.3637705-16647-262353704741441="` echo /root/.ansible/tmp/ansible-tmp-1726854959.3637705-16647-262353704741441 `" ) && sleep 0' 15406 1726854959.36982: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854959.37000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854959.37016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854959.37035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854959.37054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854959.37183: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854959.37210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854959.37357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854959.39263: stdout chunk (state=3): >>>ansible-tmp-1726854959.3637705-16647-262353704741441=/root/.ansible/tmp/ansible-tmp-1726854959.3637705-16647-262353704741441 <<< 15406 1726854959.39463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854959.39467: stdout chunk (state=3): >>><<< 15406 1726854959.39470: stderr chunk (state=3): >>><<< 15406 1726854959.39724: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726854959.3637705-16647-262353704741441=/root/.ansible/tmp/ansible-tmp-1726854959.3637705-16647-262353704741441 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854959.39728: variable 'ansible_module_compression' from source: unknown 15406 1726854959.39730: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15406 1726854959.39733: variable 'ansible_facts' from source: unknown 15406 1726854959.40246: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854959.3637705-16647-262353704741441/AnsiballZ_network_connections.py 15406 1726854959.40557: Sending initial data 15406 1726854959.40560: Sent initial data (168 bytes) 15406 1726854959.41465: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854959.41488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854959.41619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854959.41631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854959.41983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854959.42089: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854959.42263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854959.43837: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15406 1726854959.43851: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15406 1726854959.43864: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 15406 1726854959.43877: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 15406 1726854959.43902: stderr chunk (state=3): >>>debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854959.44016: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15406 1726854959.44111: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp3689go4z /root/.ansible/tmp/ansible-tmp-1726854959.3637705-16647-262353704741441/AnsiballZ_network_connections.py <<< 15406 1726854959.44114: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854959.3637705-16647-262353704741441/AnsiballZ_network_connections.py" <<< 15406 1726854959.44169: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp3689go4z" to remote "/root/.ansible/tmp/ansible-tmp-1726854959.3637705-16647-262353704741441/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854959.3637705-16647-262353704741441/AnsiballZ_network_connections.py" <<< 15406 1726854959.45416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854959.45508: stderr chunk (state=3): >>><<< 15406 1726854959.45511: stdout chunk (state=3): >>><<< 15406 1726854959.45513: done transferring module to remote 15406 1726854959.45515: _low_level_execute_command(): starting 15406 1726854959.45517: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854959.3637705-16647-262353704741441/ 
/root/.ansible/tmp/ansible-tmp-1726854959.3637705-16647-262353704741441/AnsiballZ_network_connections.py && sleep 0' 15406 1726854959.46922: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854959.47112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854959.47241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854959.47335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854959.49249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854959.49253: stdout chunk (state=3): >>><<< 15406 1726854959.49307: stderr chunk (state=3): >>><<< 15406 1726854959.49311: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854959.49314: _low_level_execute_command(): starting 15406 1726854959.49316: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854959.3637705-16647-262353704741441/AnsiballZ_network_connections.py && sleep 0' 15406 1726854959.51037: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854959.51128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854959.51214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854959.80793: stdout chunk (state=3): >>> <<< 15406 1726854959.80819: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15406 1726854959.83041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 15406 1726854959.83066: stdout chunk (state=3): >>><<< 15406 1726854959.83069: stderr chunk (state=3): >>><<< 15406 1726854959.83094: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
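The `_low_level_execute_command()` calls above trace Ansible's standard module lifecycle: create a private per-task temp directory, SFTP the AnsiballZ payload into it, `chmod u+x` the directory and payload, run it with the remote Python, then `rm -f -r` the directory. The sequence can be replayed locally as a sketch; `BASE` and `PAYLOAD` here are stand-in assumptions for `~/.ansible/tmp` and the transferred `AnsiballZ_network_connections.py`, and `python3` stands in for the discovered remote interpreter.

```shell
#!/bin/sh
# Sketch of the remote command sequence in the log above, replayed locally.
# BASE/PAYLOAD are assumptions, not the real Ansible paths.
BASE="$(mktemp -d)"
TMP="$BASE/ansible-tmp-demo"

# 1. Private (umask 77) per-task temp dir, echoed back so the controller
#    learns the path (cf. the mkdir/echo one-liner in the log).
( umask 77 && mkdir -p "$BASE" && mkdir "$TMP" && echo ansible_tmp="$TMP" )

# 2. Stand-in for the SFTP 'put' of the AnsiballZ payload.
PAYLOAD="$TMP/AnsiballZ_demo.py"
printf 'print("ok")\n' > "$PAYLOAD"

# 3. chmod step: make the directory and payload executable.
chmod u+x "$TMP" "$PAYLOAD"

# 4. Execute the payload with the remote interpreter.
python3 "$PAYLOAD"

# 5. Cleanup step, as in the final 'rm -f -r ... > /dev/null' command.
rm -f -r "$TMP"
```

Each step maps one-to-one onto an `_low_level_execute_command()` (or SFTP transfer) entry in the log; Ansible wraps every one in `/bin/sh -c '... && sleep 0'` over the multiplexed SSH connection.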
15406 1726854959.83134: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854959.3637705-16647-262353704741441/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854959.83146: _low_level_execute_command(): starting 15406 1726854959.83175: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854959.3637705-16647-262353704741441/ > /dev/null 2>&1 && sleep 0' 15406 1726854959.83763: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854959.83778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854959.83793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854959.83812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854959.83829: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854959.83940: stderr chunk (state=3): >>>debug2: match not found <<< 15406 1726854959.83943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854959.83970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854959.84075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854959.85990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854959.86015: stdout chunk (state=3): >>><<< 15406 1726854959.86018: stderr chunk (state=3): >>><<< 15406 1726854959.86094: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854959.86097: handler run complete 15406 1726854959.86099: attempt loop complete, returning result 15406 1726854959.86102: _execute() done 15406 1726854959.86104: dumping result to json 15406 1726854959.86106: done dumping result, returning 15406 1726854959.86108: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-3c83-32d3-00000000004a] 15406 1726854959.86110: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000004a 15406 1726854959.86277: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000004a 15406 1726854959.86280: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15406 1726854959.86382: no more pending results, returning what we have 15406 1726854959.86385: results queue empty 15406 1726854959.86386: checking for any_errors_fatal 15406 1726854959.86394: done checking for any_errors_fatal 15406 1726854959.86395: checking for max_fail_percentage 15406 1726854959.86397: done checking for max_fail_percentage 15406 1726854959.86398: checking to see if all hosts have failed and the running result is not ok 15406 1726854959.86399: done checking to see if all hosts have failed 15406 1726854959.86399: getting the remaining hosts for this loop 15406 1726854959.86401: done getting the remaining hosts for this loop 15406 1726854959.86404: getting the next task for 
host managed_node2 15406 1726854959.86410: done getting next task for host managed_node2 15406 1726854959.86414: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15406 1726854959.86416: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854959.86425: getting variables 15406 1726854959.86426: in VariableManager get_vars() 15406 1726854959.86463: Calling all_inventory to load vars for managed_node2 15406 1726854959.86466: Calling groups_inventory to load vars for managed_node2 15406 1726854959.86468: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854959.86479: Calling all_plugins_play to load vars for managed_node2 15406 1726854959.86482: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854959.86485: Calling groups_plugins_play to load vars for managed_node2 15406 1726854959.88452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854959.90042: done with get_vars() 15406 1726854959.90065: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:55:59 -0400 (0:00:00.749) 0:00:27.724 ****** 15406 1726854959.90156: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 15406 1726854959.90495: worker is 1 (out of 1 available) 15406 1726854959.90508: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 15406 1726854959.90523: done queuing things up, now waiting for results queue to drain 15406 1726854959.90524: waiting for pending results... 
15406 1726854959.90890: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 15406 1726854959.90896: in run() - task 0affcc66-ac2b-3c83-32d3-00000000004b 15406 1726854959.90899: variable 'ansible_search_path' from source: unknown 15406 1726854959.90902: variable 'ansible_search_path' from source: unknown 15406 1726854959.90911: calling self._execute() 15406 1726854959.91009: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854959.91013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854959.91026: variable 'omit' from source: magic vars 15406 1726854959.91392: variable 'ansible_distribution_major_version' from source: facts 15406 1726854959.91423: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854959.91531: variable 'network_state' from source: role '' defaults 15406 1726854959.91534: Evaluated conditional (network_state != {}): False 15406 1726854959.91537: when evaluation is False, skipping this task 15406 1726854959.91539: _execute() done 15406 1726854959.91542: dumping result to json 15406 1726854959.91544: done dumping result, returning 15406 1726854959.91593: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-3c83-32d3-00000000004b] 15406 1726854959.91596: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000004b 15406 1726854959.91818: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000004b 15406 1726854959.91821: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15406 1726854959.91861: no more pending results, returning what we have 15406 1726854959.91863: results queue empty 15406 1726854959.91864: checking for any_errors_fatal 15406 1726854959.91871: done checking for any_errors_fatal 
15406 1726854959.91872: checking for max_fail_percentage 15406 1726854959.91874: done checking for max_fail_percentage 15406 1726854959.91874: checking to see if all hosts have failed and the running result is not ok 15406 1726854959.91875: done checking to see if all hosts have failed 15406 1726854959.91876: getting the remaining hosts for this loop 15406 1726854959.91877: done getting the remaining hosts for this loop 15406 1726854959.91880: getting the next task for host managed_node2 15406 1726854959.91885: done getting next task for host managed_node2 15406 1726854959.91892: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15406 1726854959.91895: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854959.91907: getting variables 15406 1726854959.91908: in VariableManager get_vars() 15406 1726854959.91944: Calling all_inventory to load vars for managed_node2 15406 1726854959.91947: Calling groups_inventory to load vars for managed_node2 15406 1726854959.91950: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854959.91958: Calling all_plugins_play to load vars for managed_node2 15406 1726854959.91961: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854959.91964: Calling groups_plugins_play to load vars for managed_node2 15406 1726854959.93248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854959.94862: done with get_vars() 15406 1726854959.94893: done getting variables 15406 1726854959.94951: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:55:59 -0400 (0:00:00.048) 0:00:27.772 ****** 15406 1726854959.94990: entering _queue_task() for managed_node2/debug 15406 1726854959.95498: worker is 1 (out of 1 available) 15406 1726854959.95509: exiting _queue_task() for managed_node2/debug 15406 1726854959.95519: done queuing things up, now waiting for results queue to drain 15406 1726854959.95520: waiting for pending results... 
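The task being queued here, at `roles/network/tasks/main.yml:177`, is a `debug` action over the registered result of the earlier `network_connections` run. A minimal sketch of what that task likely looks like, inferred from the task name and the `ok:` output that follows (the exact task body in the role may differ):

```yaml
# Hedged reconstruction of the task at roles/network/tasks/main.yml:177.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```

Since the module returned `"stderr": "\n"`, `stderr_lines` splits to a single empty string, matching the `[""]` printed in the result below.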
15406 1726854959.95652: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15406 1726854959.95730: in run() - task 0affcc66-ac2b-3c83-32d3-00000000004c 15406 1726854959.95859: variable 'ansible_search_path' from source: unknown 15406 1726854959.95863: variable 'ansible_search_path' from source: unknown 15406 1726854959.95866: calling self._execute() 15406 1726854959.95907: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854959.95919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854959.95934: variable 'omit' from source: magic vars 15406 1726854959.96328: variable 'ansible_distribution_major_version' from source: facts 15406 1726854959.96346: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854959.96357: variable 'omit' from source: magic vars 15406 1726854959.96401: variable 'omit' from source: magic vars 15406 1726854959.96448: variable 'omit' from source: magic vars 15406 1726854959.96493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854959.96539: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854959.96562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854959.96584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854959.96603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854959.96643: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854959.96650: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854959.96657: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 15406 1726854959.96839: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854959.96842: Set connection var ansible_timeout to 10 15406 1726854959.96844: Set connection var ansible_connection to ssh 15406 1726854959.96847: Set connection var ansible_shell_type to sh 15406 1726854959.96849: Set connection var ansible_shell_executable to /bin/sh 15406 1726854959.96851: Set connection var ansible_pipelining to False 15406 1726854959.96853: variable 'ansible_shell_executable' from source: unknown 15406 1726854959.96855: variable 'ansible_connection' from source: unknown 15406 1726854959.96857: variable 'ansible_module_compression' from source: unknown 15406 1726854959.96859: variable 'ansible_shell_type' from source: unknown 15406 1726854959.96861: variable 'ansible_shell_executable' from source: unknown 15406 1726854959.96863: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854959.96865: variable 'ansible_pipelining' from source: unknown 15406 1726854959.96874: variable 'ansible_timeout' from source: unknown 15406 1726854959.96882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854959.97030: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854959.97046: variable 'omit' from source: magic vars 15406 1726854959.97166: starting attempt loop 15406 1726854959.97170: running the handler 15406 1726854959.97204: variable '__network_connections_result' from source: set_fact 15406 1726854959.97256: handler run complete 15406 1726854959.97286: attempt loop complete, returning result 15406 1726854959.97298: _execute() done 15406 1726854959.97306: dumping result to json 15406 1726854959.97314: 
done dumping result, returning 15406 1726854959.97327: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-3c83-32d3-00000000004c] 15406 1726854959.97336: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000004c 15406 1726854959.97450: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000004c 15406 1726854959.97453: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
15406 1726854959.97550: no more pending results, returning what we have 15406 1726854959.97554: results queue empty 15406 1726854959.97555: checking for any_errors_fatal 15406 1726854959.97560: done checking for any_errors_fatal 15406 1726854959.97561: checking for max_fail_percentage 15406 1726854959.97563: done checking for max_fail_percentage 15406 1726854959.97563: checking to see if all hosts have failed and the running result is not ok 15406 1726854959.97564: done checking to see if all hosts have failed 15406 1726854959.97565: getting the remaining hosts for this loop 15406 1726854959.97567: done getting the remaining hosts for this loop 15406 1726854959.97571: getting the next task for host managed_node2 15406 1726854959.97577: done getting next task for host managed_node2 15406 1726854959.97580: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15406 1726854959.97583: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 15406 1726854959.97595: getting variables 15406 1726854959.97597: in VariableManager get_vars() 15406 1726854959.97633: Calling all_inventory to load vars for managed_node2 15406 1726854959.97636: Calling groups_inventory to load vars for managed_node2 15406 1726854959.97639: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854959.97649: Calling all_plugins_play to load vars for managed_node2 15406 1726854959.97652: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854959.97655: Calling groups_plugins_play to load vars for managed_node2 15406 1726854959.99443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854960.01046: done with get_vars() 15406 1726854960.01076: done getting variables 15406 1726854960.01139: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:56:00 -0400 (0:00:00.061) 0:00:27.834 ****** 15406 1726854960.01174: entering _queue_task() for managed_node2/debug 15406 1726854960.01634: worker is 1 (out of 1 available) 15406 1726854960.01647: exiting _queue_task() for managed_node2/debug 15406 1726854960.01657: done queuing things up, now waiting for results queue to drain 15406 1726854960.01658: waiting for pending results... 
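The task result above reports `stderr_lines` as `[""]` — a single empty line. A minimal Python sketch (my illustration, not Ansible's code) of how a stderr string consisting of only a trailing newline splits into exactly that value:

```python
# A stderr of just "\n" splits into one empty "line", matching the
# stderr_lines value [""] shown in the debug task result above.
stderr = "\n"
stderr_lines = stderr.splitlines()
print(stderr_lines)  # -> ['']
```

By contrast, a fully empty string would split into `[]`, so the `[""]` in the result indicates the module wrote a lone newline to stderr rather than nothing at all.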
15406 1726854960.01949: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15406 1726854960.01959: in run() - task 0affcc66-ac2b-3c83-32d3-00000000004d 15406 1726854960.01978: variable 'ansible_search_path' from source: unknown 15406 1726854960.01986: variable 'ansible_search_path' from source: unknown 15406 1726854960.02031: calling self._execute() 15406 1726854960.02155: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854960.02159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854960.02213: variable 'omit' from source: magic vars 15406 1726854960.02572: variable 'ansible_distribution_major_version' from source: facts 15406 1726854960.02595: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854960.02607: variable 'omit' from source: magic vars 15406 1726854960.02650: variable 'omit' from source: magic vars 15406 1726854960.02697: variable 'omit' from source: magic vars 15406 1726854960.02741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854960.02803: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854960.02817: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854960.02840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854960.02864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854960.02911: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854960.02914: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854960.02916: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 15406 1726854960.03021: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854960.03080: Set connection var ansible_timeout to 10 15406 1726854960.03083: Set connection var ansible_connection to ssh 15406 1726854960.03085: Set connection var ansible_shell_type to sh 15406 1726854960.03091: Set connection var ansible_shell_executable to /bin/sh 15406 1726854960.03093: Set connection var ansible_pipelining to False 15406 1726854960.03095: variable 'ansible_shell_executable' from source: unknown 15406 1726854960.03103: variable 'ansible_connection' from source: unknown 15406 1726854960.03111: variable 'ansible_module_compression' from source: unknown 15406 1726854960.03117: variable 'ansible_shell_type' from source: unknown 15406 1726854960.03128: variable 'ansible_shell_executable' from source: unknown 15406 1726854960.03237: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854960.03240: variable 'ansible_pipelining' from source: unknown 15406 1726854960.03242: variable 'ansible_timeout' from source: unknown 15406 1726854960.03245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854960.03302: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854960.03318: variable 'omit' from source: magic vars 15406 1726854960.03328: starting attempt loop 15406 1726854960.03334: running the handler 15406 1726854960.03395: variable '__network_connections_result' from source: set_fact 15406 1726854960.03485: variable '__network_connections_result' from source: set_fact 15406 1726854960.03602: handler run complete 15406 1726854960.03630: attempt loop complete, returning result 15406 1726854960.03637: 
_execute() done 15406 1726854960.03644: dumping result to json 15406 1726854960.03652: done dumping result, returning 15406 1726854960.03663: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-3c83-32d3-00000000004d] 15406 1726854960.03679: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000004d 15406 1726854960.03850: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000004d 15406 1726854960.03853: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "LSR-TST-br31",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
15406 1726854960.03946: no more pending results, returning what we have 15406 1726854960.03950: results queue empty 15406 1726854960.03951: checking for any_errors_fatal 15406 1726854960.03958: done checking for any_errors_fatal 15406 1726854960.03958: checking for max_fail_percentage 15406 1726854960.03961: done checking for max_fail_percentage 15406 1726854960.03961: checking to see if all hosts have failed and the running result is not ok 15406 1726854960.03962: done checking to see if all hosts have failed 15406 1726854960.03963: getting the remaining hosts for this loop 15406 1726854960.03964: done getting the remaining hosts for this loop 15406 1726854960.03969: getting the next task for host managed_node2 15406 1726854960.03976: done getting next task for host managed_node2 15406 1726854960.03979: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15406 1726854960.03982: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0,
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854960.03994: getting variables 15406 1726854960.04090: in VariableManager get_vars() 15406 1726854960.04135: Calling all_inventory to load vars for managed_node2 15406 1726854960.04139: Calling groups_inventory to load vars for managed_node2 15406 1726854960.04141: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854960.04152: Calling all_plugins_play to load vars for managed_node2 15406 1726854960.04155: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854960.04158: Calling groups_plugins_play to load vars for managed_node2 15406 1726854960.05720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854960.07391: done with get_vars() 15406 1726854960.07427: done getting variables 15406 1726854960.07491: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:56:00 -0400 (0:00:00.063) 0:00:27.897 ****** 15406 1726854960.07534: entering _queue_task() for managed_node2/debug 15406 1726854960.07993: worker is 1 (out of 1 available) 15406 1726854960.08003: exiting _queue_task() for managed_node2/debug 15406 1726854960.08013: done queuing things up, now waiting for results queue to drain 15406 1726854960.08015: waiting for pending results... 
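The two debug results above print, respectively, the whole `__network_connections_result` fact and the dotted sub-expression `__network_connections_result.stderr_lines`. A small sketch of that dotted lookup — `resolve_var` is a hypothetical helper for illustration, not Ansible's actual API (Ansible templates `var:` expressions through Jinja2):

```python
# Hypothetical helper: resolve a dotted "var:" expression such as
# "__network_connections_result.stderr_lines" against a mapping of
# task vars, descending one key per dotted component.
def resolve_var(task_vars, dotted):
    value = task_vars
    for part in dotted.split("."):
        value = value[part]
    return value

# Task vars shaped like the set_fact result shown in the log above.
task_vars = {
    "__network_connections_result": {
        "changed": True,
        "failed": False,
        "stderr": "\n",
        "stderr_lines": [""],
    }
}

print(resolve_var(task_vars, "__network_connections_result.stderr_lines"))
```

Under these assumptions, the lookup returns the same `[""]` value the first debug task displayed.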
15406 1726854960.08252: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15406 1726854960.08349: in run() - task 0affcc66-ac2b-3c83-32d3-00000000004e 15406 1726854960.08353: variable 'ansible_search_path' from source: unknown 15406 1726854960.08356: variable 'ansible_search_path' from source: unknown 15406 1726854960.08381: calling self._execute() 15406 1726854960.08512: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854960.08515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854960.08518: variable 'omit' from source: magic vars 15406 1726854960.08917: variable 'ansible_distribution_major_version' from source: facts 15406 1726854960.08933: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854960.09164: variable 'network_state' from source: role '' defaults 15406 1726854960.09167: Evaluated conditional (network_state != {}): False 15406 1726854960.09169: when evaluation is False, skipping this task 15406 1726854960.09172: _execute() done 15406 1726854960.09174: dumping result to json 15406 1726854960.09176: done dumping result, returning 15406 1726854960.09179: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-3c83-32d3-00000000004e] 15406 1726854960.09181: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000004e 15406 1726854960.09247: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000004e 15406 1726854960.09251: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "false_condition": "network_state != {}"
}
15406 1726854960.09307: no more pending results, returning what we have 15406 1726854960.09311: results queue empty 15406 1726854960.09312: checking for any_errors_fatal 15406 1726854960.09321: done checking for any_errors_fatal 15406 1726854960.09322: checking for
max_fail_percentage 15406 1726854960.09325: done checking for max_fail_percentage 15406 1726854960.09325: checking to see if all hosts have failed and the running result is not ok 15406 1726854960.09326: done checking to see if all hosts have failed 15406 1726854960.09327: getting the remaining hosts for this loop 15406 1726854960.09328: done getting the remaining hosts for this loop 15406 1726854960.09332: getting the next task for host managed_node2 15406 1726854960.09338: done getting next task for host managed_node2 15406 1726854960.09342: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15406 1726854960.09345: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854960.09359: getting variables 15406 1726854960.09361: in VariableManager get_vars() 15406 1726854960.09517: Calling all_inventory to load vars for managed_node2 15406 1726854960.09520: Calling groups_inventory to load vars for managed_node2 15406 1726854960.09522: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854960.09536: Calling all_plugins_play to load vars for managed_node2 15406 1726854960.09539: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854960.09542: Calling groups_plugins_play to load vars for managed_node2 15406 1726854960.11229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854960.12868: done with get_vars() 15406 1726854960.12901: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:56:00 -0400 
(0:00:00.054) 0:00:27.952 ****** 15406 1726854960.13013: entering _queue_task() for managed_node2/ping 15406 1726854960.13365: worker is 1 (out of 1 available) 15406 1726854960.13379: exiting _queue_task() for managed_node2/ping 15406 1726854960.13503: done queuing things up, now waiting for results queue to drain 15406 1726854960.13505: waiting for pending results... 15406 1726854960.13804: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 15406 1726854960.13809: in run() - task 0affcc66-ac2b-3c83-32d3-00000000004f 15406 1726854960.13812: variable 'ansible_search_path' from source: unknown 15406 1726854960.13814: variable 'ansible_search_path' from source: unknown 15406 1726854960.13857: calling self._execute() 15406 1726854960.13962: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854960.13972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854960.13984: variable 'omit' from source: magic vars 15406 1726854960.14380: variable 'ansible_distribution_major_version' from source: facts 15406 1726854960.14399: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854960.14409: variable 'omit' from source: magic vars 15406 1726854960.14446: variable 'omit' from source: magic vars 15406 1726854960.14590: variable 'omit' from source: magic vars 15406 1726854960.14594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854960.14596: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854960.14598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854960.14618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854960.14633: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854960.14664: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854960.14671: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854960.14678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854960.14785: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854960.14799: Set connection var ansible_timeout to 10 15406 1726854960.14806: Set connection var ansible_connection to ssh 15406 1726854960.14819: Set connection var ansible_shell_type to sh 15406 1726854960.14832: Set connection var ansible_shell_executable to /bin/sh 15406 1726854960.14842: Set connection var ansible_pipelining to False 15406 1726854960.14868: variable 'ansible_shell_executable' from source: unknown 15406 1726854960.14933: variable 'ansible_connection' from source: unknown 15406 1726854960.14936: variable 'ansible_module_compression' from source: unknown 15406 1726854960.14938: variable 'ansible_shell_type' from source: unknown 15406 1726854960.14940: variable 'ansible_shell_executable' from source: unknown 15406 1726854960.14942: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854960.14944: variable 'ansible_pipelining' from source: unknown 15406 1726854960.14946: variable 'ansible_timeout' from source: unknown 15406 1726854960.14948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854960.15116: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854960.15132: variable 'omit' from source: magic vars 15406 1726854960.15140: starting attempt loop 15406 1726854960.15154: running 
the handler 15406 1726854960.15171: _low_level_execute_command(): starting 15406 1726854960.15181: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854960.15994: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854960.16015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854960.16049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854960.16065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854960.16086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854960.16193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854960.17990: stdout chunk (state=3): >>>/root <<< 15406 1726854960.18140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854960.18144: stdout chunk (state=3): >>><<< 15406 1726854960.18146: stderr chunk (state=3): >>><<< 15406 1726854960.18168: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854960.18192: _low_level_execute_command(): starting 15406 1726854960.18280: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854960.181771-16717-228479448337762 `" && echo ansible-tmp-1726854960.181771-16717-228479448337762="` echo /root/.ansible/tmp/ansible-tmp-1726854960.181771-16717-228479448337762 `" ) && sleep 0' 15406 1726854960.18861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854960.18875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854960.18893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854960.18953: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854960.19024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854960.19077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854960.19156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854960.21100: stdout chunk (state=3): >>>ansible-tmp-1726854960.181771-16717-228479448337762=/root/.ansible/tmp/ansible-tmp-1726854960.181771-16717-228479448337762 <<< 15406 1726854960.21260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854960.21264: stdout chunk (state=3): >>><<< 15406 1726854960.21267: stderr chunk (state=3): >>><<< 15406 1726854960.21493: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854960.181771-16717-228479448337762=/root/.ansible/tmp/ansible-tmp-1726854960.181771-16717-228479448337762 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 
10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854960.21497: variable 'ansible_module_compression' from source: unknown 15406 1726854960.21499: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15406 1726854960.21502: variable 'ansible_facts' from source: unknown 15406 1726854960.21515: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854960.181771-16717-228479448337762/AnsiballZ_ping.py 15406 1726854960.21750: Sending initial data 15406 1726854960.21753: Sent initial data (152 bytes) 15406 1726854960.22320: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854960.22335: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854960.22395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854960.22458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854960.22482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854960.22506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854960.22614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854960.24209: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854960.24292: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854960.24373: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpr_dzetth /root/.ansible/tmp/ansible-tmp-1726854960.181771-16717-228479448337762/AnsiballZ_ping.py <<< 15406 1726854960.24377: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854960.181771-16717-228479448337762/AnsiballZ_ping.py" <<< 15406 1726854960.24447: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpr_dzetth" to remote "/root/.ansible/tmp/ansible-tmp-1726854960.181771-16717-228479448337762/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854960.181771-16717-228479448337762/AnsiballZ_ping.py" <<< 15406 1726854960.25294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854960.25333: stderr chunk (state=3): >>><<< 15406 1726854960.25343: stdout chunk (state=3): >>><<< 15406 1726854960.25420: done transferring module to remote 15406 1726854960.25437: _low_level_execute_command(): starting 15406 1726854960.25447: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854960.181771-16717-228479448337762/ /root/.ansible/tmp/ansible-tmp-1726854960.181771-16717-228479448337762/AnsiballZ_ping.py && sleep 0' 15406 1726854960.26105: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854960.26208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854960.26211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854960.26249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854960.26266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854960.26286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854960.26392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854960.28200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854960.28234: stdout chunk (state=3): >>><<< 15406 1726854960.28237: stderr chunk (state=3): >>><<< 15406 1726854960.28337: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854960.28341: _low_level_execute_command(): starting 15406 1726854960.28343: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854960.181771-16717-228479448337762/AnsiballZ_ping.py && sleep 0' 15406 1726854960.28904: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854960.28918: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854960.28943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854960.28962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854960.29005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854960.29079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854960.29112: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854960.29145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854960.29223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854960.44046: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15406 1726854960.45328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854960.45335: stdout chunk (state=3): >>><<< 15406 1726854960.45337: stderr chunk (state=3): >>><<< 15406 1726854960.45473: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 
closed. 15406 1726854960.45477: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854960.181771-16717-228479448337762/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854960.45480: _low_level_execute_command(): starting 15406 1726854960.45482: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854960.181771-16717-228479448337762/ > /dev/null 2>&1 && sleep 0' 15406 1726854960.46058: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854960.46073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854960.46092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854960.46144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 
10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854960.46210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854960.46227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854960.46259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854960.46362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854960.48263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854960.48283: stdout chunk (state=3): >>><<< 15406 1726854960.48286: stderr chunk (state=3): >>><<< 15406 1726854960.48306: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 15406 1726854960.48497: handler run complete 15406 1726854960.48500: attempt loop complete, returning result 15406 1726854960.48503: _execute() done 15406 1726854960.48505: dumping result to json 15406 1726854960.48507: done dumping result, returning 15406 1726854960.48509: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-3c83-32d3-00000000004f] 15406 1726854960.48511: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000004f 15406 1726854960.48576: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000004f 15406 1726854960.48579: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 15406 1726854960.48658: no more pending results, returning what we have 15406 1726854960.48662: results queue empty 15406 1726854960.48663: checking for any_errors_fatal 15406 1726854960.48668: done checking for any_errors_fatal 15406 1726854960.48669: checking for max_fail_percentage 15406 1726854960.48671: done checking for max_fail_percentage 15406 1726854960.48672: checking to see if all hosts have failed and the running result is not ok 15406 1726854960.48673: done checking to see if all hosts have failed 15406 1726854960.48674: getting the remaining hosts for this loop 15406 1726854960.48675: done getting the remaining hosts for this loop 15406 1726854960.48680: getting the next task for host managed_node2 15406 1726854960.48697: done getting next task for host managed_node2 15406 1726854960.48700: ^ task is: TASK: meta (role_complete) 15406 1726854960.48702: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854960.48712: getting variables 15406 1726854960.48714: in VariableManager get_vars() 15406 1726854960.48754: Calling all_inventory to load vars for managed_node2 15406 1726854960.48757: Calling groups_inventory to load vars for managed_node2 15406 1726854960.48760: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854960.48770: Calling all_plugins_play to load vars for managed_node2 15406 1726854960.48773: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854960.48776: Calling groups_plugins_play to load vars for managed_node2 15406 1726854960.50454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854960.52289: done with get_vars() 15406 1726854960.52311: done getting variables 15406 1726854960.52408: done queuing things up, now waiting for results queue to drain 15406 1726854960.52411: results queue empty 15406 1726854960.52411: checking for any_errors_fatal 15406 1726854960.52414: done checking for any_errors_fatal 15406 1726854960.52415: checking for max_fail_percentage 15406 1726854960.52416: done checking for max_fail_percentage 15406 1726854960.52417: checking to see if all hosts have failed and the running result is not ok 15406 1726854960.52417: done checking to see if all hosts have failed 15406 1726854960.52418: getting the remaining hosts for this loop 15406 1726854960.52419: done getting the remaining hosts for this loop 15406 1726854960.52421: getting the next task for host managed_node2 15406 1726854960.52425: done getting next task for host managed_node2 15406 1726854960.52427: ^ task is: TASK: meta (flush_handlers) 15406 1726854960.52428: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15406 1726854960.52431: getting variables 15406 1726854960.52432: in VariableManager get_vars() 15406 1726854960.52442: Calling all_inventory to load vars for managed_node2 15406 1726854960.52445: Calling groups_inventory to load vars for managed_node2 15406 1726854960.52447: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854960.52451: Calling all_plugins_play to load vars for managed_node2 15406 1726854960.52453: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854960.52462: Calling groups_plugins_play to load vars for managed_node2 15406 1726854960.53667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854960.55322: done with get_vars() 15406 1726854960.55342: done getting variables 15406 1726854960.55390: in VariableManager get_vars() 15406 1726854960.55402: Calling all_inventory to load vars for managed_node2 15406 1726854960.55405: Calling groups_inventory to load vars for managed_node2 15406 1726854960.55407: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854960.55412: Calling all_plugins_play to load vars for managed_node2 15406 1726854960.55415: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854960.55417: Calling groups_plugins_play to load vars for managed_node2 15406 1726854960.56720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854960.58370: done with get_vars() 15406 1726854960.58397: done queuing things up, now waiting for results queue to drain 15406 1726854960.58399: results queue empty 15406 1726854960.58400: checking for any_errors_fatal 15406 1726854960.58401: done checking for any_errors_fatal 15406 1726854960.58402: checking for max_fail_percentage 15406 1726854960.58403: done checking for max_fail_percentage 15406 1726854960.58403: checking to see if all hosts have failed and 
the running result is not ok 15406 1726854960.58404: done checking to see if all hosts have failed 15406 1726854960.58405: getting the remaining hosts for this loop 15406 1726854960.58406: done getting the remaining hosts for this loop 15406 1726854960.58408: getting the next task for host managed_node2 15406 1726854960.58412: done getting next task for host managed_node2 15406 1726854960.58413: ^ task is: TASK: meta (flush_handlers) 15406 1726854960.58415: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854960.58417: getting variables 15406 1726854960.58418: in VariableManager get_vars() 15406 1726854960.58429: Calling all_inventory to load vars for managed_node2 15406 1726854960.58431: Calling groups_inventory to load vars for managed_node2 15406 1726854960.58433: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854960.58438: Calling all_plugins_play to load vars for managed_node2 15406 1726854960.58440: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854960.58443: Calling groups_plugins_play to load vars for managed_node2 15406 1726854960.59623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854960.61168: done with get_vars() 15406 1726854960.61186: done getting variables 15406 1726854960.61225: in VariableManager get_vars() 15406 1726854960.61235: Calling all_inventory to load vars for managed_node2 15406 1726854960.61237: Calling groups_inventory to load vars for managed_node2 15406 1726854960.61238: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854960.61241: Calling all_plugins_play to load vars for managed_node2 15406 1726854960.61243: Calling 
groups_plugins_inventory to load vars for managed_node2 15406 1726854960.61244: Calling groups_plugins_play to load vars for managed_node2 15406 1726854960.61923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854960.62778: done with get_vars() 15406 1726854960.62805: done queuing things up, now waiting for results queue to drain 15406 1726854960.62807: results queue empty 15406 1726854960.62808: checking for any_errors_fatal 15406 1726854960.62809: done checking for any_errors_fatal 15406 1726854960.62810: checking for max_fail_percentage 15406 1726854960.62814: done checking for max_fail_percentage 15406 1726854960.62815: checking to see if all hosts have failed and the running result is not ok 15406 1726854960.62815: done checking to see if all hosts have failed 15406 1726854960.62816: getting the remaining hosts for this loop 15406 1726854960.62817: done getting the remaining hosts for this loop 15406 1726854960.62820: getting the next task for host managed_node2 15406 1726854960.62824: done getting next task for host managed_node2 15406 1726854960.62824: ^ task is: None 15406 1726854960.62826: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854960.62827: done queuing things up, now waiting for results queue to drain 15406 1726854960.62828: results queue empty 15406 1726854960.62828: checking for any_errors_fatal 15406 1726854960.62829: done checking for any_errors_fatal 15406 1726854960.62830: checking for max_fail_percentage 15406 1726854960.62831: done checking for max_fail_percentage 15406 1726854960.62831: checking to see if all hosts have failed and the running result is not ok 15406 1726854960.62832: done checking to see if all hosts have failed 15406 1726854960.62833: getting the next task for host managed_node2 15406 1726854960.62835: done getting next task for host managed_node2 15406 1726854960.62836: ^ task is: None 15406 1726854960.62837: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854960.62874: in VariableManager get_vars() 15406 1726854960.62899: done with get_vars() 15406 1726854960.62908: in VariableManager get_vars() 15406 1726854960.62922: done with get_vars() 15406 1726854960.62927: variable 'omit' from source: magic vars 15406 1726854960.62953: in VariableManager get_vars() 15406 1726854960.62960: done with get_vars() 15406 1726854960.62974: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 15406 1726854960.63138: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15406 1726854960.63160: getting the remaining hosts for this loop 15406 1726854960.63161: done getting the remaining hosts for this loop 15406 1726854960.63164: getting the next task for host managed_node2 15406 1726854960.63166: done getting next task for host managed_node2 15406 1726854960.63168: ^ task is: TASK: Gathering Facts 15406 1726854960.63170: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854960.63172: getting variables 15406 1726854960.63173: in VariableManager get_vars() 15406 1726854960.63181: Calling all_inventory to load vars for managed_node2 15406 1726854960.63183: Calling groups_inventory to load vars for managed_node2 15406 1726854960.63185: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854960.63193: Calling all_plugins_play to load vars for managed_node2 15406 1726854960.63195: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854960.63198: Calling groups_plugins_play to load vars for managed_node2 15406 1726854960.64275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854960.68451: done with get_vars() 15406 1726854960.68471: done getting variables 15406 1726854960.68522: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 13:56:00 -0400 (0:00:00.555) 0:00:28.508 ****** 15406 1726854960.68549: entering _queue_task() for managed_node2/gather_facts 15406 1726854960.68897: worker is 1 (out of 1 available) 15406 1726854960.68911: exiting _queue_task() for managed_node2/gather_facts 15406 1726854960.68925: done queuing things up, now waiting for results queue to drain 15406 1726854960.68926: waiting for pending results... 
15406 1726854960.69180: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15406 1726854960.69391: in run() - task 0affcc66-ac2b-3c83-32d3-000000000382 15406 1726854960.69395: variable 'ansible_search_path' from source: unknown 15406 1726854960.69399: calling self._execute() 15406 1726854960.69524: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854960.69540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854960.69554: variable 'omit' from source: magic vars 15406 1726854960.69889: variable 'ansible_distribution_major_version' from source: facts 15406 1726854960.69898: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854960.69907: variable 'omit' from source: magic vars 15406 1726854960.69936: variable 'omit' from source: magic vars 15406 1726854960.69958: variable 'omit' from source: magic vars 15406 1726854960.69991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854960.70021: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854960.70041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854960.70054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854960.70063: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854960.70088: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854960.70092: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854960.70094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854960.70164: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 
1726854960.70171: Set connection var ansible_timeout to 10 15406 1726854960.70173: Set connection var ansible_connection to ssh 15406 1726854960.70178: Set connection var ansible_shell_type to sh 15406 1726854960.70182: Set connection var ansible_shell_executable to /bin/sh 15406 1726854960.70190: Set connection var ansible_pipelining to False 15406 1726854960.70212: variable 'ansible_shell_executable' from source: unknown 15406 1726854960.70216: variable 'ansible_connection' from source: unknown 15406 1726854960.70219: variable 'ansible_module_compression' from source: unknown 15406 1726854960.70221: variable 'ansible_shell_type' from source: unknown 15406 1726854960.70224: variable 'ansible_shell_executable' from source: unknown 15406 1726854960.70226: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854960.70229: variable 'ansible_pipelining' from source: unknown 15406 1726854960.70231: variable 'ansible_timeout' from source: unknown 15406 1726854960.70235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854960.70371: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854960.70377: variable 'omit' from source: magic vars 15406 1726854960.70382: starting attempt loop 15406 1726854960.70384: running the handler 15406 1726854960.70400: variable 'ansible_facts' from source: unknown 15406 1726854960.70417: _low_level_execute_command(): starting 15406 1726854960.70424: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854960.70930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15406 1726854960.70934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854960.70940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854960.70994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854960.71002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854960.71005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854960.71079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854960.72815: stdout chunk (state=3): >>>/root <<< 15406 1726854960.72909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854960.72913: stdout chunk (state=3): >>><<< 15406 1726854960.72916: stderr chunk (state=3): >>><<< 15406 1726854960.73023: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854960.73028: _low_level_execute_command(): starting 15406 1726854960.73032: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854960.7294059-16734-270717519996694 `" && echo ansible-tmp-1726854960.7294059-16734-270717519996694="` echo /root/.ansible/tmp/ansible-tmp-1726854960.7294059-16734-270717519996694 `" ) && sleep 0' 15406 1726854960.73524: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854960.73557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854960.73561: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: 
re-parsing configuration <<< 15406 1726854960.73570: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 15406 1726854960.73572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854960.73620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854960.73623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854960.73704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854960.75591: stdout chunk (state=3): >>>ansible-tmp-1726854960.7294059-16734-270717519996694=/root/.ansible/tmp/ansible-tmp-1726854960.7294059-16734-270717519996694 <<< 15406 1726854960.75779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854960.75782: stdout chunk (state=3): >>><<< 15406 1726854960.75785: stderr chunk (state=3): >>><<< 15406 1726854960.75805: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854960.7294059-16734-270717519996694=/root/.ansible/tmp/ansible-tmp-1726854960.7294059-16734-270717519996694 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854960.75848: variable 'ansible_module_compression' from source: unknown 15406 1726854960.75875: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15406 1726854960.75926: variable 'ansible_facts' from source: unknown 15406 1726854960.76053: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854960.7294059-16734-270717519996694/AnsiballZ_setup.py 15406 1726854960.76156: Sending initial data 15406 1726854960.76159: Sent initial data (154 bytes) 15406 1726854960.76605: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854960.76608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854960.76611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 15406 1726854960.76613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854960.76665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854960.76672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854960.76740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854960.78301: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854960.78404: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854960.78471: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpjq5cwy3p /root/.ansible/tmp/ansible-tmp-1726854960.7294059-16734-270717519996694/AnsiballZ_setup.py <<< 15406 1726854960.78477: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854960.7294059-16734-270717519996694/AnsiballZ_setup.py" <<< 15406 1726854960.78538: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpjq5cwy3p" to remote "/root/.ansible/tmp/ansible-tmp-1726854960.7294059-16734-270717519996694/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854960.7294059-16734-270717519996694/AnsiballZ_setup.py" <<< 15406 1726854960.80000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854960.80049: stderr chunk (state=3): >>><<< 15406 1726854960.80053: stdout chunk (state=3): >>><<< 15406 1726854960.80198: done transferring module to remote 15406 1726854960.80202: _low_level_execute_command(): starting 15406 1726854960.80205: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854960.7294059-16734-270717519996694/ /root/.ansible/tmp/ansible-tmp-1726854960.7294059-16734-270717519996694/AnsiballZ_setup.py && sleep 0' 15406 1726854960.80912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854960.81003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854960.81051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854960.81074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854960.81141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854960.83023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854960.83027: stderr chunk (state=3): >>><<< 15406 1726854960.83029: stdout chunk (state=3): >>><<< 15406 1726854960.83064: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854960.83067: _low_level_execute_command(): starting 15406 1726854960.83070: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854960.7294059-16734-270717519996694/AnsiballZ_setup.py && sleep 0' 15406 1726854960.83561: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854960.83565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854960.83576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854960.83635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854960.83642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854960.83714: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 15406 1726854961.46542: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": 
"/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", 
"ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 744, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795065856, "block_size": 4096, "block_total": 65519099, "block_available": 63914811, "block_used": 1604288, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "01", "epoch": "1726854961", "epoch_int": "1726854961", "date": "2024-09-20", "time": "13:56:01", "iso8601_micro": "2024-09-20T17:56:01.423331Z", "iso8601": "2024-09-20T17:56:01Z", "iso8601_basic": "20240920T135601423331", "iso8601_basic_short": 
"20240920T135601", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.35791015625, "5m": 0.34814453125, "15m": 0.177734375}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off 
[fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", 
"highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": 
{"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15406 1726854961.48426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854961.48431: stdout chunk (state=3): >>><<< 15406 1726854961.48434: stderr chunk (state=3): >>><<< 15406 1726854961.48437: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, 
"ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 744, 
"ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795065856, "block_size": 4096, "block_total": 65519099, "block_available": 63914811, "block_used": 1604288, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "01", "epoch": "1726854961", "epoch_int": "1726854961", "date": "2024-09-20", "time": "13:56:01", "iso8601_micro": "2024-09-20T17:56:01.423331Z", "iso8601": "2024-09-20T17:56:01Z", "iso8601_basic": "20240920T135601423331", "iso8601_basic_short": "20240920T135601", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.35791015625, "5m": 0.34814453125, "15m": 0.177734375}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": 
"10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854961.50308: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854960.7294059-16734-270717519996694/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854961.50313: _low_level_execute_command(): starting 15406 1726854961.50315: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854960.7294059-16734-270717519996694/ > /dev/null 2>&1 && sleep 0' 15406 1726854961.51911: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854961.52024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854961.52043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854961.52257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854961.54112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854961.54192: stderr chunk (state=3): >>><<< 15406 1726854961.54230: stdout chunk (state=3): >>><<< 15406 1726854961.54295: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854961.54338: handler run complete 15406 1726854961.54620: variable 'ansible_facts' from source: unknown 15406 1726854961.55002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854961.56193: variable 'ansible_facts' from source: unknown 15406 1726854961.56392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854961.56395: attempt loop complete, returning result 15406 1726854961.56402: _execute() done 15406 1726854961.56406: dumping result to json 15406 1726854961.56408: done dumping result, returning 15406 1726854961.56595: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-3c83-32d3-000000000382] 15406 1726854961.56606: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000382 ok: [managed_node2] 15406 1726854961.58124: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000382 15406 1726854961.58128: WORKER PROCESS EXITING 15406 1726854961.58205: no more pending results, returning what we have 15406 1726854961.58208: results queue empty 15406 1726854961.58209: checking for any_errors_fatal 15406 1726854961.58210: done checking for any_errors_fatal 15406 1726854961.58211: checking for max_fail_percentage 15406 1726854961.58212: done checking for max_fail_percentage 15406 1726854961.58213: checking to see if all hosts have failed and the running result is not ok 15406 1726854961.58214: done checking to see if all hosts have failed 15406 1726854961.58215: getting the remaining hosts for this loop 15406 1726854961.58216: done getting the remaining hosts for this loop 15406 
1726854961.58220: getting the next task for host managed_node2 15406 1726854961.58224: done getting next task for host managed_node2 15406 1726854961.58339: ^ task is: TASK: meta (flush_handlers) 15406 1726854961.58342: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854961.58347: getting variables 15406 1726854961.58348: in VariableManager get_vars() 15406 1726854961.58371: Calling all_inventory to load vars for managed_node2 15406 1726854961.58373: Calling groups_inventory to load vars for managed_node2 15406 1726854961.58377: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854961.58389: Calling all_plugins_play to load vars for managed_node2 15406 1726854961.58393: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854961.58397: Calling groups_plugins_play to load vars for managed_node2 15406 1726854961.61538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854961.64658: done with get_vars() 15406 1726854961.64807: done getting variables 15406 1726854961.64874: in VariableManager get_vars() 15406 1726854961.64996: Calling all_inventory to load vars for managed_node2 15406 1726854961.65002: Calling groups_inventory to load vars for managed_node2 15406 1726854961.65006: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854961.65011: Calling all_plugins_play to load vars for managed_node2 15406 1726854961.65013: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854961.65016: Calling groups_plugins_play to load vars for managed_node2 15406 1726854961.67586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 15406 1726854961.69714: done with get_vars() 15406 1726854961.69746: done queuing things up, now waiting for results queue to drain 15406 1726854961.69749: results queue empty 15406 1726854961.69750: checking for any_errors_fatal 15406 1726854961.69754: done checking for any_errors_fatal 15406 1726854961.69754: checking for max_fail_percentage 15406 1726854961.69755: done checking for max_fail_percentage 15406 1726854961.69756: checking to see if all hosts have failed and the running result is not ok 15406 1726854961.69757: done checking to see if all hosts have failed 15406 1726854961.69763: getting the remaining hosts for this loop 15406 1726854961.69764: done getting the remaining hosts for this loop 15406 1726854961.69767: getting the next task for host managed_node2 15406 1726854961.69771: done getting next task for host managed_node2 15406 1726854961.69774: ^ task is: TASK: Include the task 'delete_interface.yml' 15406 1726854961.69776: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854961.69778: getting variables 15406 1726854961.69779: in VariableManager get_vars() 15406 1726854961.69797: Calling all_inventory to load vars for managed_node2 15406 1726854961.69802: Calling groups_inventory to load vars for managed_node2 15406 1726854961.69805: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854961.69811: Calling all_plugins_play to load vars for managed_node2 15406 1726854961.69813: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854961.69816: Calling groups_plugins_play to load vars for managed_node2 15406 1726854961.71065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854961.72778: done with get_vars() 15406 1726854961.72804: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 13:56:01 -0400 (0:00:01.043) 0:00:29.551 ****** 15406 1726854961.72884: entering _queue_task() for managed_node2/include_tasks 15406 1726854961.73860: worker is 1 (out of 1 available) 15406 1726854961.73875: exiting _queue_task() for managed_node2/include_tasks 15406 1726854961.73993: done queuing things up, now waiting for results queue to drain 15406 1726854961.73995: waiting for pending results... 
15406 1726854961.74677: running TaskExecutor() for managed_node2/TASK: Include the task 'delete_interface.yml' 15406 1726854961.75295: in run() - task 0affcc66-ac2b-3c83-32d3-000000000052 15406 1726854961.75299: variable 'ansible_search_path' from source: unknown 15406 1726854961.75303: calling self._execute() 15406 1726854961.75365: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854961.75895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854961.75900: variable 'omit' from source: magic vars 15406 1726854961.77092: variable 'ansible_distribution_major_version' from source: facts 15406 1726854961.77117: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854961.77134: _execute() done 15406 1726854961.77145: dumping result to json 15406 1726854961.77159: done dumping result, returning 15406 1726854961.77172: done running TaskExecutor() for managed_node2/TASK: Include the task 'delete_interface.yml' [0affcc66-ac2b-3c83-32d3-000000000052] 15406 1726854961.77192: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000052 15406 1726854961.77363: no more pending results, returning what we have 15406 1726854961.77378: in VariableManager get_vars() 15406 1726854961.77440: Calling all_inventory to load vars for managed_node2 15406 1726854961.77443: Calling groups_inventory to load vars for managed_node2 15406 1726854961.77447: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854961.77462: Calling all_plugins_play to load vars for managed_node2 15406 1726854961.77468: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854961.77477: Calling groups_plugins_play to load vars for managed_node2 15406 1726854961.78497: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000052 15406 1726854961.78503: WORKER PROCESS EXITING 15406 1726854961.82230: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854961.85540: done with get_vars() 15406 1726854961.85567: variable 'ansible_search_path' from source: unknown 15406 1726854961.85584: we have included files to process 15406 1726854961.85585: generating all_blocks data 15406 1726854961.85586: done generating all_blocks data 15406 1726854961.85589: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15406 1726854961.85590: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15406 1726854961.85593: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15406 1726854961.85841: done processing included file 15406 1726854961.85843: iterating over new_blocks loaded from include file 15406 1726854961.85845: in VariableManager get_vars() 15406 1726854961.85858: done with get_vars() 15406 1726854961.85860: filtering new block on tags 15406 1726854961.85889: done filtering new block on tags 15406 1726854961.85892: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 15406 1726854961.85897: extending task lists for all hosts with included blocks 15406 1726854961.85930: done extending task lists 15406 1726854961.85937: done processing included files 15406 1726854961.85938: results queue empty 15406 1726854961.85939: checking for any_errors_fatal 15406 1726854961.85941: done checking for any_errors_fatal 15406 1726854961.85941: checking for max_fail_percentage 15406 1726854961.85942: done checking for max_fail_percentage 15406 1726854961.85943: checking to see if all hosts have failed and the running result 
is not ok 15406 1726854961.85944: done checking to see if all hosts have failed 15406 1726854961.85945: getting the remaining hosts for this loop 15406 1726854961.85946: done getting the remaining hosts for this loop 15406 1726854961.85948: getting the next task for host managed_node2 15406 1726854961.85952: done getting next task for host managed_node2 15406 1726854961.85954: ^ task is: TASK: Remove test interface if necessary 15406 1726854961.85957: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854961.85959: getting variables 15406 1726854961.85960: in VariableManager get_vars() 15406 1726854961.85969: Calling all_inventory to load vars for managed_node2 15406 1726854961.85972: Calling groups_inventory to load vars for managed_node2 15406 1726854961.85974: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854961.85980: Calling all_plugins_play to load vars for managed_node2 15406 1726854961.85982: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854961.85985: Calling groups_plugins_play to load vars for managed_node2 15406 1726854961.87574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854961.90259: done with get_vars() 15406 1726854961.90294: done getting variables 15406 1726854961.90353: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 13:56:01 -0400 (0:00:00.174) 0:00:29.726 ****** 15406 1726854961.90385: entering _queue_task() for managed_node2/command 15406 1726854961.90892: worker is 1 (out of 1 available) 15406 1726854961.90907: exiting _queue_task() for managed_node2/command 15406 1726854961.90920: done queuing things up, now waiting for results queue to drain 15406 1726854961.90921: waiting for pending results... 
15406 1726854961.91128: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 15406 1726854961.91255: in run() - task 0affcc66-ac2b-3c83-32d3-000000000393 15406 1726854961.91306: variable 'ansible_search_path' from source: unknown 15406 1726854961.91311: variable 'ansible_search_path' from source: unknown 15406 1726854961.91374: calling self._execute() 15406 1726854961.91447: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854961.91459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854961.91480: variable 'omit' from source: magic vars 15406 1726854961.92394: variable 'ansible_distribution_major_version' from source: facts 15406 1726854961.92400: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854961.92403: variable 'omit' from source: magic vars 15406 1726854961.92406: variable 'omit' from source: magic vars 15406 1726854961.92722: variable 'interface' from source: set_fact 15406 1726854961.92726: variable 'omit' from source: magic vars 15406 1726854961.92831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854961.92895: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854961.92947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854961.92968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854961.92985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854961.93078: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854961.93090: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854961.93102: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854961.93357: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854961.93439: Set connection var ansible_timeout to 10 15406 1726854961.93442: Set connection var ansible_connection to ssh 15406 1726854961.93444: Set connection var ansible_shell_type to sh 15406 1726854961.93447: Set connection var ansible_shell_executable to /bin/sh 15406 1726854961.93459: Set connection var ansible_pipelining to False 15406 1726854961.93801: variable 'ansible_shell_executable' from source: unknown 15406 1726854961.93805: variable 'ansible_connection' from source: unknown 15406 1726854961.93808: variable 'ansible_module_compression' from source: unknown 15406 1726854961.93810: variable 'ansible_shell_type' from source: unknown 15406 1726854961.93812: variable 'ansible_shell_executable' from source: unknown 15406 1726854961.93814: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854961.93816: variable 'ansible_pipelining' from source: unknown 15406 1726854961.93818: variable 'ansible_timeout' from source: unknown 15406 1726854961.93820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854961.93913: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854961.93929: variable 'omit' from source: magic vars 15406 1726854961.93939: starting attempt loop 15406 1726854961.93945: running the handler 15406 1726854961.93962: _low_level_execute_command(): starting 15406 1726854961.94020: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854961.95230: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854961.95283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854961.95306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854961.95337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854961.95468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854961.97197: stdout chunk (state=3): >>>/root <<< 15406 1726854961.97384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854961.97390: stdout chunk (state=3): >>><<< 15406 1726854961.97393: stderr chunk (state=3): >>><<< 15406 1726854961.97693: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854961.97697: _low_level_execute_command(): starting 15406 1726854961.97702: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854961.9759107-16791-134878068016221 `" && echo ansible-tmp-1726854961.9759107-16791-134878068016221="` echo /root/.ansible/tmp/ansible-tmp-1726854961.9759107-16791-134878068016221 `" ) && sleep 0' 15406 1726854961.98914: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854961.99562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854961.99621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854962.01544: stdout chunk (state=3): >>>ansible-tmp-1726854961.9759107-16791-134878068016221=/root/.ansible/tmp/ansible-tmp-1726854961.9759107-16791-134878068016221 <<< 15406 1726854962.01657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854962.01703: stderr chunk (state=3): >>><<< 15406 1726854962.01741: stdout chunk (state=3): >>><<< 15406 1726854962.01767: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854961.9759107-16791-134878068016221=/root/.ansible/tmp/ansible-tmp-1726854961.9759107-16791-134878068016221 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854962.01831: variable 'ansible_module_compression' from source: unknown 15406 1726854962.02101: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15406 1726854962.02109: variable 'ansible_facts' from source: unknown 15406 1726854962.02302: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854961.9759107-16791-134878068016221/AnsiballZ_command.py 15406 1726854962.02722: Sending initial data 15406 1726854962.02732: Sent initial data (156 bytes) 15406 1726854962.04476: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854962.04671: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854962.04934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854962.06659: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854962.06713: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854962.06856: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpltmw92nf /root/.ansible/tmp/ansible-tmp-1726854961.9759107-16791-134878068016221/AnsiballZ_command.py <<< 15406 1726854962.06860: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854961.9759107-16791-134878068016221/AnsiballZ_command.py" <<< 15406 1726854962.06958: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpltmw92nf" to remote "/root/.ansible/tmp/ansible-tmp-1726854961.9759107-16791-134878068016221/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854961.9759107-16791-134878068016221/AnsiballZ_command.py" <<< 15406 1726854962.08688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854962.08909: stderr chunk (state=3): >>><<< 15406 1726854962.08915: stdout chunk (state=3): >>><<< 15406 1726854962.08917: done transferring module to remote 15406 1726854962.08920: _low_level_execute_command(): starting 15406 1726854962.08922: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854961.9759107-16791-134878068016221/ /root/.ansible/tmp/ansible-tmp-1726854961.9759107-16791-134878068016221/AnsiballZ_command.py && sleep 0' 15406 1726854962.09820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854962.09881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854962.09899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854962.09970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854962.09998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854962.10030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854962.10217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854962.12136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854962.12140: stdout chunk (state=3): >>><<< 15406 1726854962.12142: stderr chunk (state=3): >>><<< 15406 1726854962.12263: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854962.12273: _low_level_execute_command(): starting 15406 1726854962.12276: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854961.9759107-16791-134878068016221/AnsiballZ_command.py && sleep 0' 15406 1726854962.13643: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854962.13647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854962.13722: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854962.13725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854962.13849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 15406 1726854962.14055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854962.29793: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-20 13:56:02.287981", "end": "2024-09-20 13:56:02.295618", "delta": "0:00:00.007637", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15406 1726854962.31294: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.45.178 closed. <<< 15406 1726854962.31298: stderr chunk (state=3): >>><<< 15406 1726854962.31300: stdout chunk (state=3): >>><<< 15406 1726854962.31509: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-20 13:56:02.287981", "end": "2024-09-20 13:56:02.295618", "delta": "0:00:00.007637", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.45.178 closed. 15406 1726854962.31513: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854961.9759107-16791-134878068016221/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854962.31515: _low_level_execute_command(): starting 15406 1726854962.31518: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854961.9759107-16791-134878068016221/ > /dev/null 2>&1 && sleep 0' 15406 1726854962.33005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854962.33224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854962.33250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854962.33343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854962.35268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854962.35304: stdout chunk (state=3): >>><<< 15406 1726854962.35321: stderr chunk (state=3): >>><<< 15406 1726854962.35355: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854962.35494: handler run complete 15406 1726854962.35498: Evaluated conditional (False): False 15406 1726854962.35500: attempt loop complete, returning result 15406 1726854962.35502: _execute() done 15406 1726854962.35504: dumping result to json 15406 1726854962.35506: done dumping result, returning 15406 1726854962.35509: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [0affcc66-ac2b-3c83-32d3-000000000393] 15406 1726854962.35510: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000393 15406 1726854962.35581: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000393 15406 1726854962.35585: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! 
=> { "changed": false, "cmd": [ "ip", "link", "del", "LSR-TST-br31" ], "delta": "0:00:00.007637", "end": "2024-09-20 13:56:02.295618", "rc": 1, "start": "2024-09-20 13:56:02.287981" } STDERR: Cannot find device "LSR-TST-br31" MSG: non-zero return code ...ignoring 15406 1726854962.35657: no more pending results, returning what we have 15406 1726854962.35660: results queue empty 15406 1726854962.35662: checking for any_errors_fatal 15406 1726854962.35663: done checking for any_errors_fatal 15406 1726854962.35664: checking for max_fail_percentage 15406 1726854962.35666: done checking for max_fail_percentage 15406 1726854962.35667: checking to see if all hosts have failed and the running result is not ok 15406 1726854962.35668: done checking to see if all hosts have failed 15406 1726854962.35669: getting the remaining hosts for this loop 15406 1726854962.35670: done getting the remaining hosts for this loop 15406 1726854962.35674: getting the next task for host managed_node2 15406 1726854962.35683: done getting next task for host managed_node2 15406 1726854962.35686: ^ task is: TASK: meta (flush_handlers) 15406 1726854962.35803: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854962.35819: getting variables 15406 1726854962.35823: in VariableManager get_vars() 15406 1726854962.35859: Calling all_inventory to load vars for managed_node2 15406 1726854962.35865: Calling groups_inventory to load vars for managed_node2 15406 1726854962.35870: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854962.35885: Calling all_plugins_play to load vars for managed_node2 15406 1726854962.35956: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854962.35961: Calling groups_plugins_play to load vars for managed_node2 15406 1726854962.39961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854962.42060: done with get_vars() 15406 1726854962.42118: done getting variables 15406 1726854962.42361: in VariableManager get_vars() 15406 1726854962.42372: Calling all_inventory to load vars for managed_node2 15406 1726854962.42375: Calling groups_inventory to load vars for managed_node2 15406 1726854962.42377: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854962.42382: Calling all_plugins_play to load vars for managed_node2 15406 1726854962.42385: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854962.42408: Calling groups_plugins_play to load vars for managed_node2 15406 1726854962.45844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854962.49408: done with get_vars() 15406 1726854962.49449: done queuing things up, now waiting for results queue to drain 15406 1726854962.49451: results queue empty 15406 1726854962.49452: checking for any_errors_fatal 15406 1726854962.49456: done checking for any_errors_fatal 15406 1726854962.49572: checking for max_fail_percentage 15406 1726854962.49574: done checking for max_fail_percentage 15406 1726854962.49578: checking to see if all hosts have failed and the running result is not 
ok 15406 1726854962.49579: done checking to see if all hosts have failed 15406 1726854962.49580: getting the remaining hosts for this loop 15406 1726854962.49581: done getting the remaining hosts for this loop 15406 1726854962.49584: getting the next task for host managed_node2 15406 1726854962.49590: done getting next task for host managed_node2 15406 1726854962.49592: ^ task is: TASK: meta (flush_handlers) 15406 1726854962.49593: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854962.49596: getting variables 15406 1726854962.49597: in VariableManager get_vars() 15406 1726854962.49608: Calling all_inventory to load vars for managed_node2 15406 1726854962.49611: Calling groups_inventory to load vars for managed_node2 15406 1726854962.49614: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854962.49619: Calling all_plugins_play to load vars for managed_node2 15406 1726854962.49621: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854962.49624: Calling groups_plugins_play to load vars for managed_node2 15406 1726854962.52255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854962.53974: done with get_vars() 15406 1726854962.54064: done getting variables 15406 1726854962.54248: in VariableManager get_vars() 15406 1726854962.54258: Calling all_inventory to load vars for managed_node2 15406 1726854962.54260: Calling groups_inventory to load vars for managed_node2 15406 1726854962.54263: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854962.54268: Calling all_plugins_play to load vars for managed_node2 15406 1726854962.54270: Calling groups_plugins_inventory to load vars for 
managed_node2 15406 1726854962.54272: Calling groups_plugins_play to load vars for managed_node2 15406 1726854962.56404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854962.58509: done with get_vars() 15406 1726854962.58544: done queuing things up, now waiting for results queue to drain 15406 1726854962.58546: results queue empty 15406 1726854962.58547: checking for any_errors_fatal 15406 1726854962.58549: done checking for any_errors_fatal 15406 1726854962.58549: checking for max_fail_percentage 15406 1726854962.58551: done checking for max_fail_percentage 15406 1726854962.58551: checking to see if all hosts have failed and the running result is not ok 15406 1726854962.58552: done checking to see if all hosts have failed 15406 1726854962.58553: getting the remaining hosts for this loop 15406 1726854962.58554: done getting the remaining hosts for this loop 15406 1726854962.58557: getting the next task for host managed_node2 15406 1726854962.58560: done getting next task for host managed_node2 15406 1726854962.58560: ^ task is: None 15406 1726854962.58562: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854962.58563: done queuing things up, now waiting for results queue to drain 15406 1726854962.58564: results queue empty 15406 1726854962.58565: checking for any_errors_fatal 15406 1726854962.58565: done checking for any_errors_fatal 15406 1726854962.58566: checking for max_fail_percentage 15406 1726854962.58567: done checking for max_fail_percentage 15406 1726854962.58567: checking to see if all hosts have failed and the running result is not ok 15406 1726854962.58575: done checking to see if all hosts have failed 15406 1726854962.58576: getting the next task for host managed_node2 15406 1726854962.58578: done getting next task for host managed_node2 15406 1726854962.58579: ^ task is: None 15406 1726854962.58580: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854962.58814: in VariableManager get_vars() 15406 1726854962.58840: done with get_vars() 15406 1726854962.58845: in VariableManager get_vars() 15406 1726854962.58856: done with get_vars() 15406 1726854962.58859: variable 'omit' from source: magic vars 15406 1726854962.59084: variable 'profile' from source: play vars 15406 1726854962.59302: in VariableManager get_vars() 15406 1726854962.59315: done with get_vars() 15406 1726854962.59333: variable 'omit' from source: magic vars 15406 1726854962.59514: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 15406 1726854962.61082: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15406 1726854962.61502: getting the remaining hosts for this loop 15406 1726854962.61504: done getting the remaining hosts for this loop 15406 1726854962.61506: getting the next task for host managed_node2 15406 1726854962.61509: done getting next task for host managed_node2 15406 1726854962.61511: ^ task is: TASK: Gathering Facts 15406 1726854962.61513: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854962.61591: getting variables 15406 1726854962.61592: in VariableManager get_vars() 15406 1726854962.61606: Calling all_inventory to load vars for managed_node2 15406 1726854962.61608: Calling groups_inventory to load vars for managed_node2 15406 1726854962.61610: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854962.61616: Calling all_plugins_play to load vars for managed_node2 15406 1726854962.61618: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854962.61621: Calling groups_plugins_play to load vars for managed_node2 15406 1726854962.65412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854962.69499: done with get_vars() 15406 1726854962.69526: done getting variables 15406 1726854962.69798: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 13:56:02 -0400 (0:00:00.794) 0:00:30.520 ****** 15406 1726854962.69827: entering _queue_task() for managed_node2/gather_facts 15406 1726854962.70565: worker is 1 (out of 1 available) 15406 1726854962.70575: exiting _queue_task() for managed_node2/gather_facts 15406 1726854962.70590: done queuing things up, now waiting for results queue to drain 15406 1726854962.70591: waiting for pending results... 
15406 1726854962.71069: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15406 1726854962.71178: in run() - task 0affcc66-ac2b-3c83-32d3-0000000003a1 15406 1726854962.71594: variable 'ansible_search_path' from source: unknown 15406 1726854962.71598: calling self._execute() 15406 1726854962.71604: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854962.71607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854962.71610: variable 'omit' from source: magic vars 15406 1726854962.72324: variable 'ansible_distribution_major_version' from source: facts 15406 1726854962.72509: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854962.72520: variable 'omit' from source: magic vars 15406 1726854962.72560: variable 'omit' from source: magic vars 15406 1726854962.72606: variable 'omit' from source: magic vars 15406 1726854962.72652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854962.72695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854962.72918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854962.73394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854962.73398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854962.73404: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854962.73408: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854962.73411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854962.73413: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 
1726854962.73415: Set connection var ansible_timeout to 10 15406 1726854962.73418: Set connection var ansible_connection to ssh 15406 1726854962.73420: Set connection var ansible_shell_type to sh 15406 1726854962.73423: Set connection var ansible_shell_executable to /bin/sh 15406 1726854962.73425: Set connection var ansible_pipelining to False 15406 1726854962.73696: variable 'ansible_shell_executable' from source: unknown 15406 1726854962.73709: variable 'ansible_connection' from source: unknown 15406 1726854962.73717: variable 'ansible_module_compression' from source: unknown 15406 1726854962.73725: variable 'ansible_shell_type' from source: unknown 15406 1726854962.73732: variable 'ansible_shell_executable' from source: unknown 15406 1726854962.73738: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854962.73746: variable 'ansible_pipelining' from source: unknown 15406 1726854962.73752: variable 'ansible_timeout' from source: unknown 15406 1726854962.73758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854962.74064: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854962.74209: variable 'omit' from source: magic vars 15406 1726854962.74293: starting attempt loop 15406 1726854962.74296: running the handler 15406 1726854962.74298: variable 'ansible_facts' from source: unknown 15406 1726854962.74302: _low_level_execute_command(): starting 15406 1726854962.74304: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854962.75744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854962.75748: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854962.75751: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854962.75753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854962.76005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854962.76201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854962.77924: stdout chunk (state=3): >>>/root <<< 15406 1726854962.78020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854962.78054: stderr chunk (state=3): >>><<< 15406 1726854962.78063: stdout chunk (state=3): >>><<< 15406 1726854962.78094: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854962.78380: _low_level_execute_command(): starting 15406 1726854962.78385: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854962.7829404-16819-125025691849197 `" && echo ansible-tmp-1726854962.7829404-16819-125025691849197="` echo /root/.ansible/tmp/ansible-tmp-1726854962.7829404-16819-125025691849197 `" ) && sleep 0' 15406 1726854962.79444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854962.79459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854962.79471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854962.79491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854962.79509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854962.79521: stderr chunk (state=3): >>>debug2: match not found <<< 15406 1726854962.79544: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854962.79564: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15406 1726854962.79869: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854962.79885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854962.80128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854962.82056: stdout chunk (state=3): >>>ansible-tmp-1726854962.7829404-16819-125025691849197=/root/.ansible/tmp/ansible-tmp-1726854962.7829404-16819-125025691849197 <<< 15406 1726854962.82206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854962.82231: stderr chunk (state=3): >>><<< 15406 1726854962.82244: stdout chunk (state=3): >>><<< 15406 1726854962.82270: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854962.7829404-16819-125025691849197=/root/.ansible/tmp/ansible-tmp-1726854962.7829404-16819-125025691849197 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854962.82329: variable 'ansible_module_compression' from source: unknown 15406 1726854962.82399: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15406 1726854962.82478: variable 'ansible_facts' from source: unknown 15406 1726854962.82717: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854962.7829404-16819-125025691849197/AnsiballZ_setup.py 15406 1726854962.82917: Sending initial data 15406 1726854962.82929: Sent initial data (154 bytes) 15406 1726854962.83578: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854962.83833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854962.83957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854962.84111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854962.85643: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854962.85792: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854962.85843: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854962.7829404-16819-125025691849197/AnsiballZ_setup.py" <<< 15406 1726854962.85856: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmptoojvice /root/.ansible/tmp/ansible-tmp-1726854962.7829404-16819-125025691849197/AnsiballZ_setup.py <<< 15406 1726854962.85871: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmptoojvice" to remote "/root/.ansible/tmp/ansible-tmp-1726854962.7829404-16819-125025691849197/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854962.7829404-16819-125025691849197/AnsiballZ_setup.py" <<< 15406 1726854962.89482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854962.89490: stderr chunk (state=3): >>><<< 15406 1726854962.89493: stdout chunk (state=3): >>><<< 15406 1726854962.89495: done transferring module to remote 15406 1726854962.89497: _low_level_execute_command(): starting 15406 1726854962.89504: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854962.7829404-16819-125025691849197/ /root/.ansible/tmp/ansible-tmp-1726854962.7829404-16819-125025691849197/AnsiballZ_setup.py && sleep 0' 15406 1726854962.90716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854962.90826: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854962.90842: stderr chunk (state=3): >>>debug2: match not found <<< 15406 1726854962.90857: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854962.91034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854962.91057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854962.91075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854962.91096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854962.91250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854962.93064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854962.93074: stdout chunk (state=3): >>><<< 15406 1726854962.93090: stderr chunk (state=3): >>><<< 15406 1726854962.93127: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854962.93136: _low_level_execute_command(): starting 15406 1726854962.93145: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854962.7829404-16819-125025691849197/AnsiballZ_setup.py && sleep 0' 15406 1726854962.93740: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854962.93754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854962.93770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854962.93791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854962.93861: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854962.93918: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854962.93937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854962.93974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854962.94093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854963.57803: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "03", "epoch": "1726854963", "epoch_int": "1726854963", "date": "2024-09-20", "time": "13:56:03", "iso8601_micro": "2024-09-20T17:56:03.205225Z", "iso8601": "2024-09-20T17:56:03Z", "iso8601_basic": "20240920T135603205225", "iso8601_basic_short": "20240920T135603", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", 
"ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.3291015625, "5m": 0.34228515625, "15m": 0.1767578125}, "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 
1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2966, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 565, "free": 2966}, "nocache": {"free": 3303, "used": 228}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "<<< 15406 1726854963.57866: stdout chunk (state=3): >>>ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, 
"uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 746, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795065856, "block_size": 4096, "block_total": 65519099, "block_available": 63914811, "block_used": 1604288, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", 
"rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": 
"10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15406 1726854963.59603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854963.59615: stdout chunk (state=3): >>><<< 15406 1726854963.59627: stderr chunk (state=3): >>><<< 15406 1726854963.59667: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "03", "epoch": "1726854963", "epoch_int": "1726854963", "date": "2024-09-20", "time": "13:56:03", "iso8601_micro": "2024-09-20T17:56:03.205225Z", "iso8601": "2024-09-20T17:56:03Z", "iso8601_basic": "20240920T135603205225", "iso8601_basic_short": "20240920T135603", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", 
"ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.3291015625, "5m": 0.34228515625, "15m": 0.1767578125}, "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2966, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 565, "free": 2966}, "nocache": {"free": 3303, "used": 228}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 746, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795065856, "block_size": 4096, "block_total": 65519099, "block_available": 63914811, "block_used": 1604288, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", 
"eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": 
"off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", 
"tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": 
"targeted"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854963.60029: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854962.7829404-16819-125025691849197/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854963.60058: _low_level_execute_command(): starting 15406 1726854963.60106: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854962.7829404-16819-125025691849197/ > /dev/null 2>&1 && sleep 0' 15406 1726854963.60804: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854963.60830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854963.60848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854963.60892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854963.60965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854963.62836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854963.62961: stdout chunk (state=3): >>><<< 15406 1726854963.62975: stderr chunk (state=3): >>><<< 15406 1726854963.62999: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854963.63016: handler run complete 15406 1726854963.63172: variable 'ansible_facts' from source: unknown 15406 1726854963.63342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854963.63717: variable 'ansible_facts' from source: unknown 15406 1726854963.63826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854963.63992: attempt loop complete, returning result 15406 1726854963.64071: _execute() done 15406 1726854963.64074: dumping result to json 15406 1726854963.64076: done dumping result, returning 15406 1726854963.64078: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-3c83-32d3-0000000003a1] 15406 1726854963.64086: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000003a1 ok: [managed_node2] 15406 1726854963.65197: no more pending results, returning what we have 15406 1726854963.65201: results queue empty 15406 1726854963.65202: checking for any_errors_fatal 15406 1726854963.65203: done checking for any_errors_fatal 15406 1726854963.65204: checking for max_fail_percentage 15406 1726854963.65205: done checking for max_fail_percentage 15406 1726854963.65208: checking to see if all hosts have failed and the running result is not ok 15406 1726854963.65209: done checking to see if all hosts have failed 15406 1726854963.65209: getting the remaining hosts for this loop 15406 1726854963.65210: done getting the remaining hosts for this loop 15406 1726854963.65217: getting the next task for 
host managed_node2 15406 1726854963.65222: done getting next task for host managed_node2 15406 1726854963.65224: ^ task is: TASK: meta (flush_handlers) 15406 1726854963.65226: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854963.65231: getting variables 15406 1726854963.65232: in VariableManager get_vars() 15406 1726854963.65278: Calling all_inventory to load vars for managed_node2 15406 1726854963.65281: Calling groups_inventory to load vars for managed_node2 15406 1726854963.65284: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854963.65291: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000003a1 15406 1726854963.65294: WORKER PROCESS EXITING 15406 1726854963.65306: Calling all_plugins_play to load vars for managed_node2 15406 1726854963.65313: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854963.65318: Calling groups_plugins_play to load vars for managed_node2 15406 1726854963.66785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854963.69580: done with get_vars() 15406 1726854963.69613: done getting variables 15406 1726854963.69697: in VariableManager get_vars() 15406 1726854963.69734: Calling all_inventory to load vars for managed_node2 15406 1726854963.69753: Calling groups_inventory to load vars for managed_node2 15406 1726854963.69756: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854963.69763: Calling all_plugins_play to load vars for managed_node2 15406 1726854963.69766: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854963.69769: Calling groups_plugins_play to load vars for managed_node2 15406 1726854963.71368: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854963.75955: done with get_vars() 15406 1726854963.75993: done queuing things up, now waiting for results queue to drain 15406 1726854963.76000: results queue empty 15406 1726854963.76001: checking for any_errors_fatal 15406 1726854963.76005: done checking for any_errors_fatal 15406 1726854963.76006: checking for max_fail_percentage 15406 1726854963.76008: done checking for max_fail_percentage 15406 1726854963.76009: checking to see if all hosts have failed and the running result is not ok 15406 1726854963.76013: done checking to see if all hosts have failed 15406 1726854963.76014: getting the remaining hosts for this loop 15406 1726854963.76015: done getting the remaining hosts for this loop 15406 1726854963.76019: getting the next task for host managed_node2 15406 1726854963.76023: done getting next task for host managed_node2 15406 1726854963.76027: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15406 1726854963.76029: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854963.76038: getting variables 15406 1726854963.76039: in VariableManager get_vars() 15406 1726854963.76098: Calling all_inventory to load vars for managed_node2 15406 1726854963.76101: Calling groups_inventory to load vars for managed_node2 15406 1726854963.76103: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854963.76109: Calling all_plugins_play to load vars for managed_node2 15406 1726854963.76111: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854963.76114: Calling groups_plugins_play to load vars for managed_node2 15406 1726854963.77909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854963.79699: done with get_vars() 15406 1726854963.79718: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:56:03 -0400 (0:00:01.099) 0:00:31.620 ****** 15406 1726854963.79800: entering _queue_task() for managed_node2/include_tasks 15406 1726854963.80382: worker is 1 (out of 1 available) 15406 1726854963.80411: exiting _queue_task() for managed_node2/include_tasks 15406 1726854963.80423: done queuing things up, now waiting for results queue to drain 15406 1726854963.80424: waiting for pending results... 
15406 1726854963.80665: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15406 1726854963.80819: in run() - task 0affcc66-ac2b-3c83-32d3-00000000005a 15406 1726854963.80907: variable 'ansible_search_path' from source: unknown 15406 1726854963.80911: variable 'ansible_search_path' from source: unknown 15406 1726854963.80922: calling self._execute() 15406 1726854963.81034: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854963.81045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854963.81070: variable 'omit' from source: magic vars 15406 1726854963.81481: variable 'ansible_distribution_major_version' from source: facts 15406 1726854963.81592: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854963.81607: _execute() done 15406 1726854963.81610: dumping result to json 15406 1726854963.81613: done dumping result, returning 15406 1726854963.81616: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-3c83-32d3-00000000005a] 15406 1726854963.81618: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000005a 15406 1726854963.81701: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000005a 15406 1726854963.81705: WORKER PROCESS EXITING 15406 1726854963.81826: no more pending results, returning what we have 15406 1726854963.81840: in VariableManager get_vars() 15406 1726854963.82037: Calling all_inventory to load vars for managed_node2 15406 1726854963.82041: Calling groups_inventory to load vars for managed_node2 15406 1726854963.82044: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854963.82055: Calling all_plugins_play to load vars for managed_node2 15406 1726854963.82058: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854963.82061: Calling 
groups_plugins_play to load vars for managed_node2 15406 1726854963.83903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854963.86000: done with get_vars() 15406 1726854963.86029: variable 'ansible_search_path' from source: unknown 15406 1726854963.86030: variable 'ansible_search_path' from source: unknown 15406 1726854963.86062: we have included files to process 15406 1726854963.86063: generating all_blocks data 15406 1726854963.86065: done generating all_blocks data 15406 1726854963.86066: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15406 1726854963.86067: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15406 1726854963.86069: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15406 1726854963.87097: done processing included file 15406 1726854963.87109: iterating over new_blocks loaded from include file 15406 1726854963.87111: in VariableManager get_vars() 15406 1726854963.87137: done with get_vars() 15406 1726854963.87139: filtering new block on tags 15406 1726854963.87165: done filtering new block on tags 15406 1726854963.87169: in VariableManager get_vars() 15406 1726854963.87223: done with get_vars() 15406 1726854963.87226: filtering new block on tags 15406 1726854963.87244: done filtering new block on tags 15406 1726854963.87246: in VariableManager get_vars() 15406 1726854963.87264: done with get_vars() 15406 1726854963.87266: filtering new block on tags 15406 1726854963.87280: done filtering new block on tags 15406 1726854963.87282: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 15406 1726854963.87289: extending task lists for 
all hosts with included blocks 15406 1726854963.87663: done extending task lists 15406 1726854963.87665: done processing included files 15406 1726854963.87666: results queue empty 15406 1726854963.87666: checking for any_errors_fatal 15406 1726854963.87668: done checking for any_errors_fatal 15406 1726854963.87669: checking for max_fail_percentage 15406 1726854963.87669: done checking for max_fail_percentage 15406 1726854963.87670: checking to see if all hosts have failed and the running result is not ok 15406 1726854963.87671: done checking to see if all hosts have failed 15406 1726854963.87672: getting the remaining hosts for this loop 15406 1726854963.87673: done getting the remaining hosts for this loop 15406 1726854963.87675: getting the next task for host managed_node2 15406 1726854963.87678: done getting next task for host managed_node2 15406 1726854963.87681: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15406 1726854963.87683: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854963.87694: getting variables 15406 1726854963.87695: in VariableManager get_vars() 15406 1726854963.87711: Calling all_inventory to load vars for managed_node2 15406 1726854963.87713: Calling groups_inventory to load vars for managed_node2 15406 1726854963.87715: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854963.87720: Calling all_plugins_play to load vars for managed_node2 15406 1726854963.87722: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854963.87724: Calling groups_plugins_play to load vars for managed_node2 15406 1726854963.89679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854963.93237: done with get_vars() 15406 1726854963.93386: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:56:03 -0400 (0:00:00.136) 0:00:31.757 ****** 15406 1726854963.93597: entering _queue_task() for managed_node2/setup 15406 1726854963.94385: worker is 1 (out of 1 available) 15406 1726854963.94398: exiting _queue_task() for managed_node2/setup 15406 1726854963.94409: done queuing things up, now waiting for results queue to drain 15406 1726854963.94410: waiting for pending results... 
15406 1726854963.95041: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15406 1726854963.95488: in run() - task 0affcc66-ac2b-3c83-32d3-0000000003e2 15406 1726854963.95493: variable 'ansible_search_path' from source: unknown 15406 1726854963.95495: variable 'ansible_search_path' from source: unknown 15406 1726854963.95596: calling self._execute() 15406 1726854963.95893: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854963.95897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854963.95902: variable 'omit' from source: magic vars 15406 1726854963.96992: variable 'ansible_distribution_major_version' from source: facts 15406 1726854963.96996: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854963.97559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854964.03068: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854964.03257: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854964.03401: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854964.03538: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854964.03541: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854964.03681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854964.03713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854964.03810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854964.03972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854964.03976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854964.04292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854964.04297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854964.04303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854964.04307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854964.04315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854964.04832: variable '__network_required_facts' from source: role 
'' defaults 15406 1726854964.04841: variable 'ansible_facts' from source: unknown 15406 1726854964.06521: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15406 1726854964.06526: when evaluation is False, skipping this task 15406 1726854964.06528: _execute() done 15406 1726854964.06531: dumping result to json 15406 1726854964.06533: done dumping result, returning 15406 1726854964.06539: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-3c83-32d3-0000000003e2] 15406 1726854964.06541: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000003e2 15406 1726854964.06834: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000003e2 15406 1726854964.06838: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15406 1726854964.06885: no more pending results, returning what we have 15406 1726854964.06892: results queue empty 15406 1726854964.06893: checking for any_errors_fatal 15406 1726854964.06895: done checking for any_errors_fatal 15406 1726854964.06896: checking for max_fail_percentage 15406 1726854964.06897: done checking for max_fail_percentage 15406 1726854964.06898: checking to see if all hosts have failed and the running result is not ok 15406 1726854964.06899: done checking to see if all hosts have failed 15406 1726854964.06900: getting the remaining hosts for this loop 15406 1726854964.06901: done getting the remaining hosts for this loop 15406 1726854964.06905: getting the next task for host managed_node2 15406 1726854964.06914: done getting next task for host managed_node2 15406 1726854964.06918: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15406 1726854964.06922: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854964.06935: getting variables 15406 1726854964.06937: in VariableManager get_vars() 15406 1726854964.06980: Calling all_inventory to load vars for managed_node2 15406 1726854964.06983: Calling groups_inventory to load vars for managed_node2 15406 1726854964.06986: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854964.07000: Calling all_plugins_play to load vars for managed_node2 15406 1726854964.07004: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854964.07007: Calling groups_plugins_play to load vars for managed_node2 15406 1726854964.10294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854964.13770: done with get_vars() 15406 1726854964.13803: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:56:04 -0400 (0:00:00.205) 0:00:31.962 ****** 15406 1726854964.14027: entering _queue_task() for managed_node2/stat 15406 1726854964.14924: worker is 1 (out of 1 available) 15406 1726854964.14935: exiting _queue_task() for managed_node2/stat 15406 1726854964.14947: done queuing things up, now waiting for results queue to drain 15406 1726854964.15062: waiting for pending results... 
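The task skipped above was gated on `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, which evaluated to False because every required fact was already gathered. A minimal pure-Python sketch of that check, with plain set membership standing in for Ansible's `difference` filter; the fact names below are hypothetical, chosen only for illustration:

```python
# Sketch of the "Ensure ansible_facts used by role are present" conditional.
# Ansible's `difference` filter keeps items of the left list that are absent
# from the right list; the task runs only when that result is non-empty.

def missing_facts(required, ansible_facts):
    """Return required fact names not yet present, preserving order."""
    present = set(ansible_facts.keys())
    return [name for name in required if name not in present]

# Hypothetical fact names for illustration:
required = ["distribution", "distribution_major_version", "os_family"]
facts = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "os_family": "RedHat",
}

needs_gathering = len(missing_facts(required, facts)) > 0
print(needs_gathering)  # False -> the task is skipped, as in the log
```

When the conditional is False, the executor short-circuits exactly as logged: `when evaluation is False, skipping this task`, followed by the JSON result dump.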
15406 1726854964.15391: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 15406 1726854964.15897: in run() - task 0affcc66-ac2b-3c83-32d3-0000000003e4 15406 1726854964.15905: variable 'ansible_search_path' from source: unknown 15406 1726854964.15909: variable 'ansible_search_path' from source: unknown 15406 1726854964.15912: calling self._execute() 15406 1726854964.16040: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854964.16046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854964.16056: variable 'omit' from source: magic vars 15406 1726854964.16943: variable 'ansible_distribution_major_version' from source: facts 15406 1726854964.16955: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854964.17285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854964.17947: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854964.18165: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854964.18168: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854964.18241: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854964.18446: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854964.18512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854964.18561: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854964.18661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854964.18864: variable '__network_is_ostree' from source: set_fact 15406 1726854964.18871: Evaluated conditional (not __network_is_ostree is defined): False 15406 1726854964.18874: when evaluation is False, skipping this task 15406 1726854964.18876: _execute() done 15406 1726854964.18879: dumping result to json 15406 1726854964.18882: done dumping result, returning 15406 1726854964.18903: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-3c83-32d3-0000000003e4] 15406 1726854964.18906: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000003e4 15406 1726854964.19007: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000003e4 15406 1726854964.19011: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15406 1726854964.19066: no more pending results, returning what we have 15406 1726854964.19070: results queue empty 15406 1726854964.19071: checking for any_errors_fatal 15406 1726854964.19080: done checking for any_errors_fatal 15406 1726854964.19080: checking for max_fail_percentage 15406 1726854964.19082: done checking for max_fail_percentage 15406 1726854964.19083: checking to see if all hosts have failed and the running result is not ok 15406 1726854964.19085: done checking to see if all hosts have failed 15406 1726854964.19086: getting the remaining hosts for this loop 15406 1726854964.19089: done getting the remaining hosts for this loop 15406 
1726854964.19093: getting the next task for host managed_node2 15406 1726854964.19214: done getting next task for host managed_node2 15406 1726854964.19219: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15406 1726854964.19222: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854964.19235: getting variables 15406 1726854964.19237: in VariableManager get_vars() 15406 1726854964.19282: Calling all_inventory to load vars for managed_node2 15406 1726854964.19284: Calling groups_inventory to load vars for managed_node2 15406 1726854964.19412: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854964.19424: Calling all_plugins_play to load vars for managed_node2 15406 1726854964.19428: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854964.19432: Calling groups_plugins_play to load vars for managed_node2 15406 1726854964.24102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854964.26178: done with get_vars() 15406 1726854964.26213: done getting variables 15406 1726854964.26272: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:56:04 -0400 (0:00:00.122) 0:00:32.085 ****** 15406 1726854964.26314: entering _queue_task() for managed_node2/set_fact 15406 1726854964.26897: worker is 1 (out of 1 available) 15406 1726854964.26908: exiting _queue_task() for managed_node2/set_fact 15406 1726854964.26919: done queuing things up, now waiting for results queue to drain 15406 1726854964.26920: waiting for pending results... 15406 1726854964.27065: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15406 1726854964.27547: in run() - task 0affcc66-ac2b-3c83-32d3-0000000003e5 15406 1726854964.27551: variable 'ansible_search_path' from source: unknown 15406 1726854964.27554: variable 'ansible_search_path' from source: unknown 15406 1726854964.27557: calling self._execute() 15406 1726854964.27755: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854964.27760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854964.27767: variable 'omit' from source: magic vars 15406 1726854964.28555: variable 'ansible_distribution_major_version' from source: facts 15406 1726854964.28567: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854964.28920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854964.29322: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854964.29490: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854964.29523: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 
1726854964.29558: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854964.29762: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854964.29790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854964.29925: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854964.29952: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854964.30154: variable '__network_is_ostree' from source: set_fact 15406 1726854964.30161: Evaluated conditional (not __network_is_ostree is defined): False 15406 1726854964.30164: when evaluation is False, skipping this task 15406 1726854964.30167: _execute() done 15406 1726854964.30169: dumping result to json 15406 1726854964.30173: done dumping result, returning 15406 1726854964.30183: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-3c83-32d3-0000000003e5] 15406 1726854964.30189: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000003e5 15406 1726854964.30330: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000003e5 15406 1726854964.30333: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15406 1726854964.30390: no more pending results, returning what we 
have 15406 1726854964.30395: results queue empty 15406 1726854964.30396: checking for any_errors_fatal 15406 1726854964.30404: done checking for any_errors_fatal 15406 1726854964.30405: checking for max_fail_percentage 15406 1726854964.30406: done checking for max_fail_percentage 15406 1726854964.30408: checking to see if all hosts have failed and the running result is not ok 15406 1726854964.30409: done checking to see if all hosts have failed 15406 1726854964.30409: getting the remaining hosts for this loop 15406 1726854964.30411: done getting the remaining hosts for this loop 15406 1726854964.30414: getting the next task for host managed_node2 15406 1726854964.30425: done getting next task for host managed_node2 15406 1726854964.30428: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15406 1726854964.30432: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854964.30447: getting variables 15406 1726854964.30449: in VariableManager get_vars() 15406 1726854964.30492: Calling all_inventory to load vars for managed_node2 15406 1726854964.30495: Calling groups_inventory to load vars for managed_node2 15406 1726854964.30498: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854964.30509: Calling all_plugins_play to load vars for managed_node2 15406 1726854964.30512: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854964.30515: Calling groups_plugins_play to load vars for managed_node2 15406 1726854964.32845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854964.34471: done with get_vars() 15406 1726854964.34499: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:56:04 -0400 (0:00:00.082) 0:00:32.168 ****** 15406 1726854964.34607: entering _queue_task() for managed_node2/service_facts 15406 1726854964.34998: worker is 1 (out of 1 available) 15406 1726854964.35010: exiting _queue_task() for managed_node2/service_facts 15406 1726854964.35022: done queuing things up, now waiting for results queue to drain 15406 1726854964.35023: waiting for pending results... 
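The `service_facts` execution that follows walks through Ansible's standard module sequence, visible in the low-level log: open the multiplexed SSH session, create a per-task remote directory named `ansible-tmp-<timestamp>-<pid>-<random>` under `~/.ansible/tmp`, transfer the AnsiballZ payload over SFTP, `chmod u+x` it, then run it with the remote Python. A hedged local sketch of how such a directory name can be composed; the random-suffix width here is an assumption, and only the overall shape matches what ansible-core emits:

```python
# Sketch of the per-task temp-directory naming pattern seen in the log
# (e.g. ansible-tmp-1726854964.3961298-16905-103946844997049).
# The exact random range ansible-core uses is an assumption; only the
# <epoch>-<pid>-<random> shape is taken from the log itself.
import os
import random
import time

def ansible_tmp_name():
    """Build a collision-resistant per-task directory name."""
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                     random.randint(0, 2 ** 48))

name = ansible_tmp_name()
print(name)
```

The remote side then creates it with the quoted shell seen in the log, roughly `( umask 77 && mkdir -p ~/.ansible/tmp && mkdir ~/.ansible/tmp/<name> ) && sleep 0`, so the directory is private to the connecting user before the module payload lands in it.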
15406 1726854964.35333: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 15406 1726854964.35358: in run() - task 0affcc66-ac2b-3c83-32d3-0000000003e7 15406 1726854964.35371: variable 'ansible_search_path' from source: unknown 15406 1726854964.35375: variable 'ansible_search_path' from source: unknown 15406 1726854964.35421: calling self._execute() 15406 1726854964.35514: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854964.35537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854964.35540: variable 'omit' from source: magic vars 15406 1726854964.35901: variable 'ansible_distribution_major_version' from source: facts 15406 1726854964.35971: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854964.35974: variable 'omit' from source: magic vars 15406 1726854964.35977: variable 'omit' from source: magic vars 15406 1726854964.36009: variable 'omit' from source: magic vars 15406 1726854964.36047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854964.36091: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854964.36109: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854964.36126: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854964.36138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854964.36173: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854964.36177: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854964.36179: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 15406 1726854964.36294: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854964.36299: Set connection var ansible_timeout to 10 15406 1726854964.36303: Set connection var ansible_connection to ssh 15406 1726854964.36305: Set connection var ansible_shell_type to sh 15406 1726854964.36307: Set connection var ansible_shell_executable to /bin/sh 15406 1726854964.36309: Set connection var ansible_pipelining to False 15406 1726854964.36393: variable 'ansible_shell_executable' from source: unknown 15406 1726854964.36404: variable 'ansible_connection' from source: unknown 15406 1726854964.36407: variable 'ansible_module_compression' from source: unknown 15406 1726854964.36410: variable 'ansible_shell_type' from source: unknown 15406 1726854964.36412: variable 'ansible_shell_executable' from source: unknown 15406 1726854964.36414: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854964.36416: variable 'ansible_pipelining' from source: unknown 15406 1726854964.36418: variable 'ansible_timeout' from source: unknown 15406 1726854964.36420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854964.36544: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854964.36554: variable 'omit' from source: magic vars 15406 1726854964.36559: starting attempt loop 15406 1726854964.36563: running the handler 15406 1726854964.36575: _low_level_execute_command(): starting 15406 1726854964.36583: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854964.37380: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854964.37404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854964.37418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854964.37436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854964.37543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854964.39462: stdout chunk (state=3): >>>/root <<< 15406 1726854964.39510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854964.39568: stderr chunk (state=3): >>><<< 15406 1726854964.39586: stdout chunk (state=3): >>><<< 15406 1726854964.39721: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854964.39724: _low_level_execute_command(): starting 15406 1726854964.39727: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854964.3961298-16905-103946844997049 `" && echo ansible-tmp-1726854964.3961298-16905-103946844997049="` echo /root/.ansible/tmp/ansible-tmp-1726854964.3961298-16905-103946844997049 `" ) && sleep 0' 15406 1726854964.40280: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854964.40302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854964.40321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854964.40408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854964.40452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854964.40467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854964.40485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854964.40703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854964.42508: stdout chunk (state=3): >>>ansible-tmp-1726854964.3961298-16905-103946844997049=/root/.ansible/tmp/ansible-tmp-1726854964.3961298-16905-103946844997049 <<< 15406 1726854964.42638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854964.42711: stderr chunk (state=3): >>><<< 15406 1726854964.42715: stdout chunk (state=3): >>><<< 15406 1726854964.42894: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854964.3961298-16905-103946844997049=/root/.ansible/tmp/ansible-tmp-1726854964.3961298-16905-103946844997049 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854964.42898: variable 'ansible_module_compression' from source: unknown 15406 1726854964.42901: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15406 1726854964.42903: variable 'ansible_facts' from source: unknown 15406 1726854964.42999: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854964.3961298-16905-103946844997049/AnsiballZ_service_facts.py 15406 1726854964.43324: Sending initial data 15406 1726854964.43346: Sent initial data (162 bytes) 15406 1726854964.44205: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854964.44208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854964.44210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854964.44213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 
1726854964.44216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854964.44264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854964.44294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854964.44466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854964.46061: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854964.46140: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854964.46221: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpum5k6k_f /root/.ansible/tmp/ansible-tmp-1726854964.3961298-16905-103946844997049/AnsiballZ_service_facts.py <<< 15406 1726854964.46224: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854964.3961298-16905-103946844997049/AnsiballZ_service_facts.py" <<< 15406 1726854964.46283: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpum5k6k_f" to remote "/root/.ansible/tmp/ansible-tmp-1726854964.3961298-16905-103946844997049/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854964.3961298-16905-103946844997049/AnsiballZ_service_facts.py" <<< 15406 1726854964.47144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854964.47179: stderr chunk (state=3): >>><<< 15406 1726854964.47189: stdout chunk (state=3): >>><<< 15406 1726854964.47228: done transferring module to remote 15406 1726854964.47250: _low_level_execute_command(): starting 15406 1726854964.47263: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854964.3961298-16905-103946844997049/ /root/.ansible/tmp/ansible-tmp-1726854964.3961298-16905-103946844997049/AnsiballZ_service_facts.py && sleep 0' 15406 1726854964.47918: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854964.47993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854964.48018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854964.48034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854964.48051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854964.48074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854964.48180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854964.50041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854964.50081: stdout chunk (state=3): >>><<< 15406 1726854964.50084: stderr chunk (state=3): >>><<< 15406 1726854964.50202: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854964.50206: _low_level_execute_command(): starting 15406 1726854964.50209: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854964.3961298-16905-103946844997049/AnsiballZ_service_facts.py && sleep 0' 15406 1726854964.51497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854964.51516: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854964.51532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854964.51634: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 15406 1726854966.02728: stdout chunk (state=3): >>> <<< 15406 1726854966.02758: stdout chunk (state=3): >>>{"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": 
"dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": 
"initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stop<<< 15406 1726854966.02774: stdout chunk (state=3): >>>ped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "<<< 15406 1726854966.02782: stdout chunk (state=3): >>>systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.se<<< 15406 1726854966.02805: stdout chunk (state=3): >>>rvice", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stop<<< 15406 1726854966.02823: stdout chunk (state=3): >>>ped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, 
"dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "in<<< 15406 1726854966.02843: stdout chunk (state=3): >>>active", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "stat<<< 15406 1726854966.02849: stdout chunk (state=3): >>>ic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "sy<<< 15406 1726854966.02863: stdout chunk (state=3): >>>stemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15406 1726854966.04365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854966.04380: stderr chunk (state=3): >>><<< 15406 1726854966.04392: stdout chunk (state=3): >>><<< 15406 1726854966.04599: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": 
{"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": 
"systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": 
"running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": 
"systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854966.09362: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854964.3961298-16905-103946844997049/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854966.09386: _low_level_execute_command(): starting 15406 1726854966.09400: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854964.3961298-16905-103946844997049/ > /dev/null 2>&1 && sleep 0' 15406 1726854966.10454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 15406 1726854966.10458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854966.10460: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854966.10462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854966.10533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854966.10610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854966.12442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854966.12508: stderr chunk (state=3): >>><<< 15406 1726854966.12522: stdout chunk (state=3): >>><<< 15406 1726854966.12549: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 15406 1726854966.12563: handler run complete 15406 1726854966.12778: variable 'ansible_facts' from source: unknown 15406 1726854966.12931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854966.13580: variable 'ansible_facts' from source: unknown 15406 1726854966.13726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854966.13951: attempt loop complete, returning result 15406 1726854966.13963: _execute() done 15406 1726854966.13970: dumping result to json 15406 1726854966.14044: done dumping result, returning 15406 1726854966.14060: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-3c83-32d3-0000000003e7] 15406 1726854966.14094: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000003e7 15406 1726854966.18927: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000003e7 15406 1726854966.18934: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15406 1726854966.18995: no more pending results, returning what we have 15406 1726854966.19003: results queue empty 15406 1726854966.19004: checking for any_errors_fatal 15406 1726854966.19006: done checking for any_errors_fatal 15406 1726854966.19007: checking for max_fail_percentage 15406 1726854966.19009: done checking for max_fail_percentage 15406 1726854966.19010: checking to see if all hosts have failed and the running result is not ok 15406 1726854966.19010: done checking to see if all hosts have failed 15406 1726854966.19011: getting the remaining hosts for this loop 15406 1726854966.19012: done getting the remaining hosts for this loop 15406 1726854966.19015: getting the next task for host 
managed_node2 15406 1726854966.19022: done getting next task for host managed_node2 15406 1726854966.19025: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15406 1726854966.19028: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854966.19041: getting variables 15406 1726854966.19042: in VariableManager get_vars() 15406 1726854966.19059: Calling all_inventory to load vars for managed_node2 15406 1726854966.19061: Calling groups_inventory to load vars for managed_node2 15406 1726854966.19062: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854966.19071: Calling all_plugins_play to load vars for managed_node2 15406 1726854966.19074: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854966.19082: Calling groups_plugins_play to load vars for managed_node2 15406 1726854966.19867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854966.20848: done with get_vars() 15406 1726854966.20865: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:56:06 -0400 (0:00:01.863) 0:00:34.032 ****** 15406 1726854966.20959: entering _queue_task() for managed_node2/package_facts 15406 1726854966.21254: worker is 1 (out 
of 1 available) 15406 1726854966.21267: exiting _queue_task() for managed_node2/package_facts 15406 1726854966.21281: done queuing things up, now waiting for results queue to drain 15406 1726854966.21282: waiting for pending results... 15406 1726854966.21468: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 15406 1726854966.21635: in run() - task 0affcc66-ac2b-3c83-32d3-0000000003e8 15406 1726854966.21639: variable 'ansible_search_path' from source: unknown 15406 1726854966.21642: variable 'ansible_search_path' from source: unknown 15406 1726854966.21661: calling self._execute() 15406 1726854966.21816: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854966.21820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854966.21824: variable 'omit' from source: magic vars 15406 1726854966.22170: variable 'ansible_distribution_major_version' from source: facts 15406 1726854966.22179: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854966.22185: variable 'omit' from source: magic vars 15406 1726854966.22249: variable 'omit' from source: magic vars 15406 1726854966.22322: variable 'omit' from source: magic vars 15406 1726854966.22352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854966.22371: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854966.22388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854966.22432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854966.22436: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854966.22472: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854966.22477: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854966.22480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854966.22566: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854966.22570: Set connection var ansible_timeout to 10 15406 1726854966.22573: Set connection var ansible_connection to ssh 15406 1726854966.22578: Set connection var ansible_shell_type to sh 15406 1726854966.22621: Set connection var ansible_shell_executable to /bin/sh 15406 1726854966.22626: Set connection var ansible_pipelining to False 15406 1726854966.22629: variable 'ansible_shell_executable' from source: unknown 15406 1726854966.22632: variable 'ansible_connection' from source: unknown 15406 1726854966.22635: variable 'ansible_module_compression' from source: unknown 15406 1726854966.22637: variable 'ansible_shell_type' from source: unknown 15406 1726854966.22640: variable 'ansible_shell_executable' from source: unknown 15406 1726854966.22642: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854966.22644: variable 'ansible_pipelining' from source: unknown 15406 1726854966.22646: variable 'ansible_timeout' from source: unknown 15406 1726854966.22648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854966.22844: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854966.22850: variable 'omit' from source: magic vars 15406 1726854966.22852: starting attempt loop 15406 1726854966.22855: running the handler 15406 1726854966.22992: _low_level_execute_command(): starting 15406 1726854966.22995: _low_level_execute_command(): 
executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854966.23436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854966.23465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854966.23516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854966.23528: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854966.23627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854966.25249: stdout chunk (state=3): >>>/root <<< 15406 1726854966.25349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854966.25373: stderr chunk (state=3): >>><<< 15406 1726854966.25380: stdout chunk (state=3): >>><<< 15406 1726854966.25407: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854966.25427: _low_level_execute_command(): starting 15406 1726854966.25431: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854966.254134-16961-244790405076618 `" && echo ansible-tmp-1726854966.254134-16961-244790405076618="` echo /root/.ansible/tmp/ansible-tmp-1726854966.254134-16961-244790405076618 `" ) && sleep 0' 15406 1726854966.25879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854966.25882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854966.25886: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854966.25899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854966.25942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854966.25947: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854966.25953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854966.26024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854966.27936: stdout chunk (state=3): >>>ansible-tmp-1726854966.254134-16961-244790405076618=/root/.ansible/tmp/ansible-tmp-1726854966.254134-16961-244790405076618 <<< 15406 1726854966.28143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854966.28147: stdout chunk (state=3): >>><<< 15406 1726854966.28149: stderr chunk (state=3): >>><<< 15406 1726854966.28152: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854966.254134-16961-244790405076618=/root/.ansible/tmp/ansible-tmp-1726854966.254134-16961-244790405076618 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854966.28255: variable 'ansible_module_compression' from source: unknown 15406 1726854966.28273: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15406 1726854966.28342: variable 'ansible_facts' from source: unknown 15406 1726854966.28556: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854966.254134-16961-244790405076618/AnsiballZ_package_facts.py 15406 1726854966.28714: Sending initial data 15406 1726854966.28809: Sent initial data (161 bytes) 15406 1726854966.29460: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854966.29540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854966.29582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854966.29619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854966.29693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854966.31238: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15406 1726854966.31241: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854966.31309: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854966.31381: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp7qkqfqc1 /root/.ansible/tmp/ansible-tmp-1726854966.254134-16961-244790405076618/AnsiballZ_package_facts.py <<< 15406 1726854966.31383: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854966.254134-16961-244790405076618/AnsiballZ_package_facts.py" <<< 15406 1726854966.31446: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp7qkqfqc1" to remote "/root/.ansible/tmp/ansible-tmp-1726854966.254134-16961-244790405076618/AnsiballZ_package_facts.py" <<< 15406 1726854966.31449: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854966.254134-16961-244790405076618/AnsiballZ_package_facts.py" <<< 15406 1726854966.32837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854966.32841: stdout chunk (state=3): >>><<< 15406 1726854966.32843: stderr chunk (state=3): >>><<< 15406 1726854966.32845: done transferring module to remote 15406 1726854966.32847: _low_level_execute_command(): starting 15406 1726854966.32850: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854966.254134-16961-244790405076618/ /root/.ansible/tmp/ansible-tmp-1726854966.254134-16961-244790405076618/AnsiballZ_package_facts.py && sleep 0' 15406 1726854966.33246: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854966.33260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 15406 1726854966.33277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854966.33337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854966.33348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854966.33427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854966.35167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854966.35193: stderr chunk (state=3): >>><<< 15406 1726854966.35196: stdout chunk (state=3): >>><<< 15406 1726854966.35213: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854966.35216: _low_level_execute_command(): starting 15406 1726854966.35219: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854966.254134-16961-244790405076618/AnsiballZ_package_facts.py && sleep 0' 15406 1726854966.35645: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854966.35649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854966.35651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15406 1726854966.35654: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854966.35656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854966.35708: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854966.35712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854966.35794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854966.79706: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", 
"release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": 
"coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 15406 1726854966.79780: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": 
[{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", 
"version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": 
"libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": 
"1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", 
"version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", 
"version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 15406 1726854966.79843: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", 
"version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": 
"511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": 
"2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", 
"release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", 
"release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 15406 1726854966.79850: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15406 1726854966.81697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 15406 1726854966.81702: stdout chunk (state=3): >>><<< 15406 1726854966.81705: stderr chunk (state=3): >>><<< 15406 1726854966.81718: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
15406 1726854966.84097: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854966.254134-16961-244790405076618/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854966.84124: _low_level_execute_command(): starting 15406 1726854966.84171: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854966.254134-16961-244790405076618/ > /dev/null 2>&1 && sleep 0' 15406 1726854966.84722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854966.84740: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854966.84753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854966.84771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854966.84797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854966.84811: stderr chunk (state=3): >>>debug2: match not found <<< 15406 1726854966.84825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854966.84848: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 15406 1726854966.84903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854966.84940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854966.84970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854966.84984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854966.85180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854966.87097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854966.87102: stdout chunk (state=3): >>><<< 15406 1726854966.87104: stderr chunk (state=3): >>><<< 15406 1726854966.87106: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854966.87108: handler run complete 15406 1726854966.88999: variable 'ansible_facts' from source: unknown 15406 1726854966.89712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854966.91681: variable 'ansible_facts' from source: unknown 15406 1726854966.92153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854966.93149: attempt loop complete, returning result 15406 1726854966.93153: _execute() done 15406 1726854966.93155: dumping result to json 15406 1726854966.93548: done dumping result, returning 15406 1726854966.93602: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-3c83-32d3-0000000003e8] 15406 1726854966.93706: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000003e8 15406 1726854966.96299: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000003e8 15406 1726854966.96303: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15406 1726854966.96465: no more pending results, returning what we have 15406 1726854966.96469: results queue empty 15406 1726854966.96469: checking for any_errors_fatal 15406 1726854966.96475: done checking for any_errors_fatal 15406 1726854966.96476: checking for max_fail_percentage 15406 1726854966.96477: done checking for max_fail_percentage 15406 1726854966.96478: checking to see if all hosts have failed and the running result is not ok 15406 1726854966.96479: 
done checking to see if all hosts have failed 15406 1726854966.96479: getting the remaining hosts for this loop 15406 1726854966.96480: done getting the remaining hosts for this loop 15406 1726854966.96484: getting the next task for host managed_node2 15406 1726854966.96492: done getting next task for host managed_node2 15406 1726854966.96498: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15406 1726854966.96500: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854966.96511: getting variables 15406 1726854966.96513: in VariableManager get_vars() 15406 1726854966.96543: Calling all_inventory to load vars for managed_node2 15406 1726854966.96546: Calling groups_inventory to load vars for managed_node2 15406 1726854966.96548: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854966.96557: Calling all_plugins_play to load vars for managed_node2 15406 1726854966.96560: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854966.96562: Calling groups_plugins_play to load vars for managed_node2 15406 1726854966.97481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854966.98503: done with get_vars() 15406 1726854966.98523: done getting variables 15406 1726854966.98568: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:56:06 -0400 (0:00:00.776) 0:00:34.808 ****** 15406 1726854966.98598: entering _queue_task() for managed_node2/debug 15406 1726854966.98870: worker is 1 (out of 1 available) 15406 1726854966.98882: exiting _queue_task() for managed_node2/debug 15406 1726854966.98895: done queuing things up, now waiting for results queue to drain 15406 1726854966.98896: waiting for pending results... 15406 1726854966.99306: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 15406 1726854966.99311: in run() - task 0affcc66-ac2b-3c83-32d3-00000000005b 15406 1726854966.99315: variable 'ansible_search_path' from source: unknown 15406 1726854966.99318: variable 'ansible_search_path' from source: unknown 15406 1726854966.99365: calling self._execute() 15406 1726854966.99475: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854966.99489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854966.99505: variable 'omit' from source: magic vars 15406 1726854966.99916: variable 'ansible_distribution_major_version' from source: facts 15406 1726854966.99934: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854966.99945: variable 'omit' from source: magic vars 15406 1726854966.99997: variable 'omit' from source: magic vars 15406 1726854967.00195: variable 'network_provider' from source: set_fact 15406 1726854967.00198: variable 'omit' from source: magic vars 15406 1726854967.00201: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854967.00208: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854967.00234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 
1726854967.00257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854967.00273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854967.00313: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854967.00322: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854967.00330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854967.00435: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854967.00446: Set connection var ansible_timeout to 10 15406 1726854967.00454: Set connection var ansible_connection to ssh 15406 1726854967.00464: Set connection var ansible_shell_type to sh 15406 1726854967.00474: Set connection var ansible_shell_executable to /bin/sh 15406 1726854967.00485: Set connection var ansible_pipelining to False 15406 1726854967.00519: variable 'ansible_shell_executable' from source: unknown 15406 1726854967.00529: variable 'ansible_connection' from source: unknown 15406 1726854967.00537: variable 'ansible_module_compression' from source: unknown 15406 1726854967.00543: variable 'ansible_shell_type' from source: unknown 15406 1726854967.00550: variable 'ansible_shell_executable' from source: unknown 15406 1726854967.00559: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854967.00629: variable 'ansible_pipelining' from source: unknown 15406 1726854967.00632: variable 'ansible_timeout' from source: unknown 15406 1726854967.00634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854967.00729: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854967.00753: variable 'omit' from source: magic vars 15406 1726854967.00764: starting attempt loop 15406 1726854967.00771: running the handler 15406 1726854967.00822: handler run complete 15406 1726854967.00846: attempt loop complete, returning result 15406 1726854967.00857: _execute() done 15406 1726854967.00864: dumping result to json 15406 1726854967.00872: done dumping result, returning 15406 1726854967.00963: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-3c83-32d3-00000000005b] 15406 1726854967.00967: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000005b 15406 1726854967.01038: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000005b 15406 1726854967.01041: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 15406 1726854967.01126: no more pending results, returning what we have 15406 1726854967.01130: results queue empty 15406 1726854967.01131: checking for any_errors_fatal 15406 1726854967.01142: done checking for any_errors_fatal 15406 1726854967.01143: checking for max_fail_percentage 15406 1726854967.01145: done checking for max_fail_percentage 15406 1726854967.01146: checking to see if all hosts have failed and the running result is not ok 15406 1726854967.01147: done checking to see if all hosts have failed 15406 1726854967.01148: getting the remaining hosts for this loop 15406 1726854967.01149: done getting the remaining hosts for this loop 15406 1726854967.01153: getting the next task for host managed_node2 15406 1726854967.01159: done getting next task for host managed_node2 15406 1726854967.01163: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 15406 1726854967.01165: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854967.01174: getting variables 15406 1726854967.01176: in VariableManager get_vars() 15406 1726854967.01218: Calling all_inventory to load vars for managed_node2 15406 1726854967.01221: Calling groups_inventory to load vars for managed_node2 15406 1726854967.01224: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854967.01235: Calling all_plugins_play to load vars for managed_node2 15406 1726854967.01238: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854967.01241: Calling groups_plugins_play to load vars for managed_node2 15406 1726854967.02911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854967.04503: done with get_vars() 15406 1726854967.04534: done getting variables 15406 1726854967.04593: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:56:07 -0400 (0:00:00.060) 0:00:34.868 ****** 15406 1726854967.04627: entering _queue_task() for managed_node2/fail 15406 1726854967.05216: worker is 1 (out of 1 available) 15406 1726854967.05225: exiting 
_queue_task() for managed_node2/fail 15406 1726854967.05235: done queuing things up, now waiting for results queue to drain 15406 1726854967.05236: waiting for pending results... 15406 1726854967.05364: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15406 1726854967.05420: in run() - task 0affcc66-ac2b-3c83-32d3-00000000005c 15406 1726854967.05442: variable 'ansible_search_path' from source: unknown 15406 1726854967.05449: variable 'ansible_search_path' from source: unknown 15406 1726854967.05496: calling self._execute() 15406 1726854967.05601: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854967.05613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854967.05625: variable 'omit' from source: magic vars 15406 1726854967.06015: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.06029: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854967.06146: variable 'network_state' from source: role '' defaults 15406 1726854967.06192: Evaluated conditional (network_state != {}): False 15406 1726854967.06195: when evaluation is False, skipping this task 15406 1726854967.06198: _execute() done 15406 1726854967.06201: dumping result to json 15406 1726854967.06203: done dumping result, returning 15406 1726854967.06205: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-3c83-32d3-00000000005c] 15406 1726854967.06208: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000005c skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15406 
1726854967.06371: no more pending results, returning what we have 15406 1726854967.06374: results queue empty 15406 1726854967.06375: checking for any_errors_fatal 15406 1726854967.06383: done checking for any_errors_fatal 15406 1726854967.06384: checking for max_fail_percentage 15406 1726854967.06386: done checking for max_fail_percentage 15406 1726854967.06386: checking to see if all hosts have failed and the running result is not ok 15406 1726854967.06389: done checking to see if all hosts have failed 15406 1726854967.06390: getting the remaining hosts for this loop 15406 1726854967.06391: done getting the remaining hosts for this loop 15406 1726854967.06394: getting the next task for host managed_node2 15406 1726854967.06400: done getting next task for host managed_node2 15406 1726854967.06404: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15406 1726854967.06407: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854967.06422: getting variables 15406 1726854967.06424: in VariableManager get_vars() 15406 1726854967.06463: Calling all_inventory to load vars for managed_node2 15406 1726854967.06466: Calling groups_inventory to load vars for managed_node2 15406 1726854967.06468: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854967.06480: Calling all_plugins_play to load vars for managed_node2 15406 1726854967.06483: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854967.06486: Calling groups_plugins_play to load vars for managed_node2 15406 1726854967.07400: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000005c 15406 1726854967.07404: WORKER PROCESS EXITING 15406 1726854967.08008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854967.09486: done with get_vars() 15406 1726854967.09511: done getting variables 15406 1726854967.09564: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:56:07 -0400 (0:00:00.049) 0:00:34.918 ****** 15406 1726854967.09599: entering _queue_task() for managed_node2/fail 15406 1726854967.10022: worker is 1 (out of 1 available) 15406 1726854967.10034: exiting _queue_task() for managed_node2/fail 15406 1726854967.10045: done queuing things up, now waiting for results queue to drain 15406 1726854967.10045: waiting for pending results... 
15406 1726854967.10237: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15406 1726854967.10351: in run() - task 0affcc66-ac2b-3c83-32d3-00000000005d 15406 1726854967.10370: variable 'ansible_search_path' from source: unknown 15406 1726854967.10384: variable 'ansible_search_path' from source: unknown 15406 1726854967.10427: calling self._execute() 15406 1726854967.10532: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854967.10545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854967.10561: variable 'omit' from source: magic vars 15406 1726854967.10952: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.10970: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854967.11095: variable 'network_state' from source: role '' defaults 15406 1726854967.11110: Evaluated conditional (network_state != {}): False 15406 1726854967.11118: when evaluation is False, skipping this task 15406 1726854967.11125: _execute() done 15406 1726854967.11133: dumping result to json 15406 1726854967.11144: done dumping result, returning 15406 1726854967.11155: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-3c83-32d3-00000000005d] 15406 1726854967.11165: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000005d skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15406 1726854967.11314: no more pending results, returning what we have 15406 1726854967.11318: results queue empty 15406 1726854967.11319: checking for any_errors_fatal 15406 1726854967.11327: done checking for any_errors_fatal 
15406 1726854967.11328: checking for max_fail_percentage 15406 1726854967.11330: done checking for max_fail_percentage 15406 1726854967.11331: checking to see if all hosts have failed and the running result is not ok 15406 1726854967.11332: done checking to see if all hosts have failed 15406 1726854967.11332: getting the remaining hosts for this loop 15406 1726854967.11334: done getting the remaining hosts for this loop 15406 1726854967.11338: getting the next task for host managed_node2 15406 1726854967.11344: done getting next task for host managed_node2 15406 1726854967.11347: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15406 1726854967.11350: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854967.11365: getting variables 15406 1726854967.11367: in VariableManager get_vars() 15406 1726854967.11408: Calling all_inventory to load vars for managed_node2 15406 1726854967.11411: Calling groups_inventory to load vars for managed_node2 15406 1726854967.11414: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854967.11425: Calling all_plugins_play to load vars for managed_node2 15406 1726854967.11428: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854967.11432: Calling groups_plugins_play to load vars for managed_node2 15406 1726854967.12201: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000005d 15406 1726854967.12205: WORKER PROCESS EXITING 15406 1726854967.13159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854967.14700: done with get_vars() 15406 1726854967.14729: done getting variables 15406 1726854967.14785: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:56:07 -0400 (0:00:00.053) 0:00:34.971 ****** 15406 1726854967.14908: entering _queue_task() for managed_node2/fail 15406 1726854967.15634: worker is 1 (out of 1 available) 15406 1726854967.15650: exiting _queue_task() for managed_node2/fail 15406 1726854967.15661: done queuing things up, now waiting for results queue to drain 15406 1726854967.15662: waiting for pending results... 
15406 1726854967.16119: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15406 1726854967.16440: in run() - task 0affcc66-ac2b-3c83-32d3-00000000005e 15406 1726854967.16639: variable 'ansible_search_path' from source: unknown 15406 1726854967.16643: variable 'ansible_search_path' from source: unknown 15406 1726854967.16697: calling self._execute() 15406 1726854967.17441: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854967.17444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854967.17447: variable 'omit' from source: magic vars 15406 1726854967.17927: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.17944: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854967.18265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854967.22908: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854967.22983: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854967.23036: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854967.23075: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854967.23118: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854967.23298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.23302: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.23305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.23341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.23363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.23478: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.23504: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15406 1726854967.23632: variable 'ansible_distribution' from source: facts 15406 1726854967.23646: variable '__network_rh_distros' from source: role '' defaults 15406 1726854967.23662: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15406 1726854967.23939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.23972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.24005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 
1726854967.24048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.24065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.24193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.24198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.24201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.24220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.24234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.24277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.24317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15406 1726854967.24345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.24384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.24411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.24742: variable 'network_connections' from source: play vars 15406 1726854967.24758: variable 'profile' from source: play vars 15406 1726854967.24831: variable 'profile' from source: play vars 15406 1726854967.24846: variable 'interface' from source: set_fact 15406 1726854967.24912: variable 'interface' from source: set_fact 15406 1726854967.24928: variable 'network_state' from source: role '' defaults 15406 1726854967.25006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854967.25184: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854967.25231: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854967.25267: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854967.25391: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854967.25397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854967.25408: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854967.25424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.25454: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854967.25483: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15406 1726854967.25500: when evaluation is False, skipping this task 15406 1726854967.25509: _execute() done 15406 1726854967.25520: dumping result to json 15406 1726854967.25529: done dumping result, returning 15406 1726854967.25541: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-3c83-32d3-00000000005e] 15406 1726854967.25550: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000005e 15406 1726854967.25755: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000005e 15406 1726854967.25759: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15406 
1726854967.25805: no more pending results, returning what we have 15406 1726854967.25808: results queue empty 15406 1726854967.25809: checking for any_errors_fatal 15406 1726854967.25817: done checking for any_errors_fatal 15406 1726854967.25817: checking for max_fail_percentage 15406 1726854967.25819: done checking for max_fail_percentage 15406 1726854967.25820: checking to see if all hosts have failed and the running result is not ok 15406 1726854967.25821: done checking to see if all hosts have failed 15406 1726854967.25821: getting the remaining hosts for this loop 15406 1726854967.25823: done getting the remaining hosts for this loop 15406 1726854967.25826: getting the next task for host managed_node2 15406 1726854967.25832: done getting next task for host managed_node2 15406 1726854967.25835: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15406 1726854967.25837: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854967.25850: getting variables 15406 1726854967.25851: in VariableManager get_vars() 15406 1726854967.25891: Calling all_inventory to load vars for managed_node2 15406 1726854967.25893: Calling groups_inventory to load vars for managed_node2 15406 1726854967.25895: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854967.25905: Calling all_plugins_play to load vars for managed_node2 15406 1726854967.25908: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854967.25910: Calling groups_plugins_play to load vars for managed_node2 15406 1726854967.27362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854967.29189: done with get_vars() 15406 1726854967.29216: done getting variables 15406 1726854967.29277: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:56:07 -0400 (0:00:00.144) 0:00:35.115 ****** 15406 1726854967.29316: entering _queue_task() for managed_node2/dnf 15406 1726854967.29676: worker is 1 (out of 1 available) 15406 1726854967.29838: exiting _queue_task() for managed_node2/dnf 15406 1726854967.29850: done queuing things up, now waiting for results queue to drain 15406 1726854967.29852: waiting for pending results... 
15406 1726854967.30017: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15406 1726854967.30167: in run() - task 0affcc66-ac2b-3c83-32d3-00000000005f 15406 1726854967.30171: variable 'ansible_search_path' from source: unknown 15406 1726854967.30174: variable 'ansible_search_path' from source: unknown 15406 1726854967.30213: calling self._execute() 15406 1726854967.30383: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854967.30389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854967.30392: variable 'omit' from source: magic vars 15406 1726854967.30770: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.30790: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854967.31015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854967.33380: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854967.33469: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854967.33515: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854967.33562: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854967.33599: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854967.33899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.33902: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.33904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.33907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.33909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.33951: variable 'ansible_distribution' from source: facts 15406 1726854967.33961: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.33978: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15406 1726854967.34103: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854967.34263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.34297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.34344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.34414: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.34426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.34493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.34502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.34520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.34547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.34558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.34589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.34606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 
1726854967.34623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.34661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.34672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.34773: variable 'network_connections' from source: play vars 15406 1726854967.34783: variable 'profile' from source: play vars 15406 1726854967.34830: variable 'profile' from source: play vars 15406 1726854967.34833: variable 'interface' from source: set_fact 15406 1726854967.34875: variable 'interface' from source: set_fact 15406 1726854967.34928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854967.35037: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854967.35064: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854967.35088: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854967.35112: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854967.35144: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854967.35159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854967.35180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.35200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854967.35236: variable '__network_team_connections_defined' from source: role '' defaults 15406 1726854967.35392: variable 'network_connections' from source: play vars 15406 1726854967.35399: variable 'profile' from source: play vars 15406 1726854967.35446: variable 'profile' from source: play vars 15406 1726854967.35450: variable 'interface' from source: set_fact 15406 1726854967.35491: variable 'interface' from source: set_fact 15406 1726854967.35510: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15406 1726854967.35513: when evaluation is False, skipping this task 15406 1726854967.35516: _execute() done 15406 1726854967.35518: dumping result to json 15406 1726854967.35520: done dumping result, returning 15406 1726854967.35528: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-3c83-32d3-00000000005f] 15406 1726854967.35537: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000005f 15406 1726854967.35622: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000005f 15406 1726854967.35624: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
15406 1726854967.35686: no more pending results, returning what we have 15406 1726854967.35691: results queue empty 15406 1726854967.35692: checking for any_errors_fatal 15406 1726854967.35704: done checking for any_errors_fatal 15406 1726854967.35705: checking for max_fail_percentage 15406 1726854967.35706: done checking for max_fail_percentage 15406 1726854967.35707: checking to see if all hosts have failed and the running result is not ok 15406 1726854967.35708: done checking to see if all hosts have failed 15406 1726854967.35709: getting the remaining hosts for this loop 15406 1726854967.35710: done getting the remaining hosts for this loop 15406 1726854967.35713: getting the next task for host managed_node2 15406 1726854967.35719: done getting next task for host managed_node2 15406 1726854967.35723: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15406 1726854967.35725: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854967.35738: getting variables 15406 1726854967.35739: in VariableManager get_vars() 15406 1726854967.35775: Calling all_inventory to load vars for managed_node2 15406 1726854967.35777: Calling groups_inventory to load vars for managed_node2 15406 1726854967.35780: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854967.35791: Calling all_plugins_play to load vars for managed_node2 15406 1726854967.35793: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854967.35798: Calling groups_plugins_play to load vars for managed_node2 15406 1726854967.36923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854967.38323: done with get_vars() 15406 1726854967.38348: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15406 1726854967.38425: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 13:56:07 -0400 (0:00:00.091) 0:00:35.207 ******
15406 1726854967.38454: entering _queue_task() for managed_node2/yum 15406 1726854967.38780: worker is 1 (out of 1 available) 15406 1726854967.38793: exiting _queue_task() for managed_node2/yum 15406 1726854967.38805: done queuing things up, now waiting for results queue to drain 15406 1726854967.38806: waiting for pending results... 
15406 1726854967.39079: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15406 1726854967.39164: in run() - task 0affcc66-ac2b-3c83-32d3-000000000060 15406 1726854967.39176: variable 'ansible_search_path' from source: unknown 15406 1726854967.39179: variable 'ansible_search_path' from source: unknown 15406 1726854967.39250: calling self._execute() 15406 1726854967.39285: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854967.39291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854967.39303: variable 'omit' from source: magic vars 15406 1726854967.39582: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.39592: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854967.39712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854967.41361: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854967.41593: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854967.41596: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854967.41599: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854967.41601: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854967.41626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.41700: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.41711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.41739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.41750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.41823: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.41833: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15406 1726854967.41836: when evaluation is False, skipping this task 15406 1726854967.41839: _execute() done 15406 1726854967.41841: dumping result to json 15406 1726854967.41844: done dumping result, returning 15406 1726854967.41852: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-3c83-32d3-000000000060] 15406 1726854967.41856: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000060
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
15406 1726854967.41997: no more pending results, returning what we have 15406 1726854967.42000: results queue empty 15406 1726854967.42001: checking for any_errors_fatal 15406 1726854967.42008: done 
checking for any_errors_fatal 15406 1726854967.42009: checking for max_fail_percentage 15406 1726854967.42010: done checking for max_fail_percentage 15406 1726854967.42011: checking to see if all hosts have failed and the running result is not ok 15406 1726854967.42012: done checking to see if all hosts have failed 15406 1726854967.42013: getting the remaining hosts for this loop 15406 1726854967.42014: done getting the remaining hosts for this loop 15406 1726854967.42017: getting the next task for host managed_node2 15406 1726854967.42026: done getting next task for host managed_node2 15406 1726854967.42029: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15406 1726854967.42031: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854967.42042: getting variables 15406 1726854967.42044: in VariableManager get_vars() 15406 1726854967.42078: Calling all_inventory to load vars for managed_node2 15406 1726854967.42081: Calling groups_inventory to load vars for managed_node2 15406 1726854967.42083: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854967.42101: Calling all_plugins_play to load vars for managed_node2 15406 1726854967.42105: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854967.42110: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000060 15406 1726854967.42112: WORKER PROCESS EXITING 15406 1726854967.42115: Calling groups_plugins_play to load vars for managed_node2 15406 1726854967.43027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854967.43939: done with get_vars() 15406 1726854967.43969: done getting variables 15406 1726854967.44035: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 13:56:07 -0400 (0:00:00.056) 0:00:35.263 ******
15406 1726854967.44079: entering _queue_task() for managed_node2/fail 15406 1726854967.44463: worker is 1 (out of 1 available) 15406 1726854967.44482: exiting _queue_task() for managed_node2/fail 15406 1726854967.44499: done queuing things up, now waiting for results queue to drain 15406 1726854967.44500: waiting for pending results... 
15406 1726854967.44771: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15406 1726854967.44884: in run() - task 0affcc66-ac2b-3c83-32d3-000000000061 15406 1726854967.44901: variable 'ansible_search_path' from source: unknown 15406 1726854967.44910: variable 'ansible_search_path' from source: unknown 15406 1726854967.44932: calling self._execute() 15406 1726854967.45015: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854967.45020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854967.45029: variable 'omit' from source: magic vars 15406 1726854967.45492: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.45498: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854967.45660: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854967.45765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854967.48048: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854967.48108: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854967.48146: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854967.48176: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854967.48212: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854967.48364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15406 1726854967.48371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.48375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.48406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.48417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.48454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.48469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.48488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.48544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.48555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.48590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.48607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.48624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.48649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.48659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.48775: variable 'network_connections' from source: play vars 15406 1726854967.48793: variable 'profile' from source: play vars 15406 1726854967.48863: variable 'profile' from source: play vars 15406 1726854967.48866: variable 'interface' from source: set_fact 15406 1726854967.48920: variable 'interface' from source: set_fact 15406 1726854967.49030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854967.49162: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854967.49189: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854967.49215: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854967.49239: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854967.49270: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854967.49292: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854967.49415: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.49418: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854967.49423: variable '__network_team_connections_defined' from source: role '' defaults 15406 1726854967.49581: variable 'network_connections' from source: play vars 15406 1726854967.49585: variable 'profile' from source: play vars 15406 1726854967.49631: variable 'profile' from source: play vars 15406 1726854967.49635: variable 'interface' from source: set_fact 15406 1726854967.49678: variable 'interface' from source: set_fact 15406 1726854967.49701: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15406 1726854967.49704: when evaluation is False, skipping this task 15406 1726854967.49707: _execute() done 15406 1726854967.49709: dumping result to json 15406 1726854967.49712: done dumping result, returning 15406 1726854967.49720: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-3c83-32d3-000000000061] 15406 1726854967.49731: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000061 15406 1726854967.49816: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000061 15406 1726854967.49819: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
15406 1726854967.49869: no more pending results, returning what we have 15406 1726854967.49873: results queue empty 15406 1726854967.49874: checking for any_errors_fatal 15406 1726854967.49880: done checking for any_errors_fatal 15406 1726854967.49881: checking for max_fail_percentage 15406 1726854967.49882: done checking for max_fail_percentage 15406 1726854967.49884: checking to see if all hosts have failed and the running result is not ok 15406 1726854967.49885: done checking to see if all hosts have failed 15406 1726854967.49886: getting the remaining hosts for this loop 15406 1726854967.49889: done getting the remaining hosts for this loop 15406 1726854967.49892: getting the next task for host managed_node2 15406 1726854967.49898: done getting next task for host managed_node2 15406 1726854967.49902: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15406 1726854967.49904: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854967.49915: getting variables 15406 1726854967.49917: in VariableManager get_vars() 15406 1726854967.49961: Calling all_inventory to load vars for managed_node2 15406 1726854967.49964: Calling groups_inventory to load vars for managed_node2 15406 1726854967.49966: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854967.49976: Calling all_plugins_play to load vars for managed_node2 15406 1726854967.49978: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854967.49981: Calling groups_plugins_play to load vars for managed_node2 15406 1726854967.50826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854967.51694: done with get_vars() 15406 1726854967.51712: done getting variables 15406 1726854967.51754: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 13:56:07 -0400 (0:00:00.077) 0:00:35.340 ******
15406 1726854967.51779: entering _queue_task() for managed_node2/package 15406 1726854967.52156: worker is 1 (out of 1 available) 15406 1726854967.52171: exiting _queue_task() for managed_node2/package 15406 1726854967.52186: done queuing things up, now waiting for results queue to drain 15406 1726854967.52189: waiting for pending results... 
15406 1726854967.52472: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 15406 1726854967.52613: in run() - task 0affcc66-ac2b-3c83-32d3-000000000062 15406 1726854967.52617: variable 'ansible_search_path' from source: unknown 15406 1726854967.52620: variable 'ansible_search_path' from source: unknown 15406 1726854967.52652: calling self._execute() 15406 1726854967.52801: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854967.52805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854967.52808: variable 'omit' from source: magic vars 15406 1726854967.53367: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.53374: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854967.53482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854967.53723: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854967.53764: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854967.53811: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854967.53951: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854967.54006: variable 'network_packages' from source: role '' defaults 15406 1726854967.54183: variable '__network_provider_setup' from source: role '' defaults 15406 1726854967.54186: variable '__network_service_name_default_nm' from source: role '' defaults 15406 1726854967.54192: variable '__network_service_name_default_nm' from source: role '' defaults 15406 1726854967.54201: variable '__network_packages_default_nm' from source: role '' defaults 15406 1726854967.54269: variable 
'__network_packages_default_nm' from source: role '' defaults 15406 1726854967.54433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854967.56125: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854967.56175: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854967.56202: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854967.56230: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854967.56248: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854967.56332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.56355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.56374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.56404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.56416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 
1726854967.56451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.56465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.56483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.56511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.56521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.56655: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15406 1726854967.56728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.56746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.56780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.56806: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.56817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.56874: variable 'ansible_python' from source: facts 15406 1726854967.56899: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15406 1726854967.56951: variable '__network_wpa_supplicant_required' from source: role '' defaults 15406 1726854967.57008: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15406 1726854967.57088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.57108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.57128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.57151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.57162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.57216: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.57226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.57243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.57280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.57296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.57448: variable 'network_connections' from source: play vars 15406 1726854967.57451: variable 'profile' from source: play vars 15406 1726854967.57539: variable 'profile' from source: play vars 15406 1726854967.57543: variable 'interface' from source: set_fact 15406 1726854967.57664: variable 'interface' from source: set_fact 15406 1726854967.57716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854967.57727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854967.57747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.57768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854967.57807: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854967.58000: variable 'network_connections' from source: play vars 15406 1726854967.58004: variable 'profile' from source: play vars 15406 1726854967.58071: variable 'profile' from source: play vars 15406 1726854967.58075: variable 'interface' from source: set_fact 15406 1726854967.58127: variable 'interface' from source: set_fact 15406 1726854967.58165: variable '__network_packages_default_wireless' from source: role '' defaults 15406 1726854967.58243: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854967.58506: variable 'network_connections' from source: play vars 15406 1726854967.58510: variable 'profile' from source: play vars 15406 1726854967.58598: variable 'profile' from source: play vars 15406 1726854967.58602: variable 'interface' from source: set_fact 15406 1726854967.58684: variable 'interface' from source: set_fact 15406 1726854967.58716: variable '__network_packages_default_team' from source: role '' defaults 15406 1726854967.58772: variable '__network_team_connections_defined' from source: role '' defaults 15406 1726854967.59064: variable 'network_connections' from source: play vars 15406 1726854967.59068: variable 'profile' from source: play vars 15406 1726854967.59250: variable 'profile' from source: play vars 15406 1726854967.59253: variable 'interface' from source: set_fact 15406 1726854967.59256: variable 'interface' from source: set_fact 15406 1726854967.59347: variable '__network_service_name_default_initscripts' from source: role '' defaults 15406 1726854967.59381: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 15406 1726854967.59390: variable '__network_packages_default_initscripts' from source: role '' defaults 15406 1726854967.59453: variable '__network_packages_default_initscripts' from source: role '' defaults 15406 1726854967.59708: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15406 1726854967.60023: variable 'network_connections' from source: play vars 15406 1726854967.60027: variable 'profile' from source: play vars 15406 1726854967.60069: variable 'profile' from source: play vars 15406 1726854967.60073: variable 'interface' from source: set_fact 15406 1726854967.60130: variable 'interface' from source: set_fact 15406 1726854967.60163: variable 'ansible_distribution' from source: facts 15406 1726854967.60169: variable '__network_rh_distros' from source: role '' defaults 15406 1726854967.60171: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.60173: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15406 1726854967.60349: variable 'ansible_distribution' from source: facts 15406 1726854967.60353: variable '__network_rh_distros' from source: role '' defaults 15406 1726854967.60356: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.60374: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15406 1726854967.60517: variable 'ansible_distribution' from source: facts 15406 1726854967.60520: variable '__network_rh_distros' from source: role '' defaults 15406 1726854967.60531: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.60588: variable 'network_provider' from source: set_fact 15406 1726854967.60592: variable 'ansible_facts' from source: unknown 15406 1726854967.61012: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15406 
1726854967.61015: when evaluation is False, skipping this task 15406 1726854967.61020: _execute() done 15406 1726854967.61025: dumping result to json 15406 1726854967.61028: done dumping result, returning 15406 1726854967.61098: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-3c83-32d3-000000000062] 15406 1726854967.61101: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000062 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15406 1726854967.61288: no more pending results, returning what we have 15406 1726854967.61292: results queue empty 15406 1726854967.61294: checking for any_errors_fatal 15406 1726854967.61306: done checking for any_errors_fatal 15406 1726854967.61310: checking for max_fail_percentage 15406 1726854967.61313: done checking for max_fail_percentage 15406 1726854967.61314: checking to see if all hosts have failed and the running result is not ok 15406 1726854967.61315: done checking to see if all hosts have failed 15406 1726854967.61315: getting the remaining hosts for this loop 15406 1726854967.61317: done getting the remaining hosts for this loop 15406 1726854967.61321: getting the next task for host managed_node2 15406 1726854967.61335: done getting next task for host managed_node2 15406 1726854967.61340: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15406 1726854967.61343: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854967.61358: getting variables 15406 1726854967.61360: in VariableManager get_vars() 15406 1726854967.61416: Calling all_inventory to load vars for managed_node2 15406 1726854967.61419: Calling groups_inventory to load vars for managed_node2 15406 1726854967.61421: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854967.61430: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000062 15406 1726854967.61436: WORKER PROCESS EXITING 15406 1726854967.61448: Calling all_plugins_play to load vars for managed_node2 15406 1726854967.61451: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854967.61454: Calling groups_plugins_play to load vars for managed_node2 15406 1726854967.63140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854967.65131: done with get_vars() 15406 1726854967.65174: done getting variables 15406 1726854967.65266: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:56:07 -0400 (0:00:00.135) 0:00:35.475 ****** 15406 1726854967.65302: entering _queue_task() for managed_node2/package 15406 1726854967.65759: worker is 1 (out of 1 available) 15406 1726854967.65776: exiting _queue_task() for managed_node2/package 15406 1726854967.65791: done queuing things up, now waiting for results queue to drain 15406 1726854967.65793: waiting for pending results... 
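The "Install packages" skip above is driven by the guard `not network_packages is subset(ansible_facts.packages.keys())`. Jinja2's `subset` test is plain set containment, so the install runs only when some requested package is missing from the package facts. A minimal Python sketch of that logic, with hypothetical values for both variables:

```python
# Emulates the Jinja2 guard seen in the log:
#   when: not network_packages is subset(ansible_facts.packages.keys())
# "subset" is ordinary set containment: the task is skipped once every
# requested package already appears among the gathered package facts.
# Both values below are hypothetical illustrations, not from this run.
network_packages = ["NetworkManager"]
package_facts = {"NetworkManager": [{"version": "1.48.0"}], "openssh": []}

needs_install = not set(network_packages) <= set(package_facts.keys())
print(needs_install)  # → False, so the task is skipped, as in the log
```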
15406 1726854967.66272: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15406 1726854967.66444: in run() - task 0affcc66-ac2b-3c83-32d3-000000000063 15406 1726854967.66449: variable 'ansible_search_path' from source: unknown 15406 1726854967.66452: variable 'ansible_search_path' from source: unknown 15406 1726854967.66505: calling self._execute() 15406 1726854967.66607: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854967.66610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854967.66613: variable 'omit' from source: magic vars 15406 1726854967.66970: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.66978: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854967.67085: variable 'network_state' from source: role '' defaults 15406 1726854967.67091: Evaluated conditional (network_state != {}): False 15406 1726854967.67093: when evaluation is False, skipping this task 15406 1726854967.67096: _execute() done 15406 1726854967.67099: dumping result to json 15406 1726854967.67104: done dumping result, returning 15406 1726854967.67118: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-3c83-32d3-000000000063] 15406 1726854967.67121: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000063 15406 1726854967.67232: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000063 15406 1726854967.67235: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15406 1726854967.67303: no more pending results, returning what we have 15406 1726854967.67307: results queue empty 15406 1726854967.67308: checking 
for any_errors_fatal 15406 1726854967.67315: done checking for any_errors_fatal 15406 1726854967.67315: checking for max_fail_percentage 15406 1726854967.67317: done checking for max_fail_percentage 15406 1726854967.67318: checking to see if all hosts have failed and the running result is not ok 15406 1726854967.67319: done checking to see if all hosts have failed 15406 1726854967.67320: getting the remaining hosts for this loop 15406 1726854967.67321: done getting the remaining hosts for this loop 15406 1726854967.67325: getting the next task for host managed_node2 15406 1726854967.67333: done getting next task for host managed_node2 15406 1726854967.67337: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15406 1726854967.67339: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854967.67352: getting variables 15406 1726854967.67353: in VariableManager get_vars() 15406 1726854967.67388: Calling all_inventory to load vars for managed_node2 15406 1726854967.67391: Calling groups_inventory to load vars for managed_node2 15406 1726854967.67393: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854967.67402: Calling all_plugins_play to load vars for managed_node2 15406 1726854967.67404: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854967.67407: Calling groups_plugins_play to load vars for managed_node2 15406 1726854967.68251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854967.69902: done with get_vars() 15406 1726854967.69931: done getting variables 15406 1726854967.69991: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:56:07 -0400 (0:00:00.047) 0:00:35.523 ****** 15406 1726854967.70033: entering _queue_task() for managed_node2/package 15406 1726854967.70421: worker is 1 (out of 1 available) 15406 1726854967.70435: exiting _queue_task() for managed_node2/package 15406 1726854967.70451: done queuing things up, now waiting for results queue to drain 15406 1726854967.70452: waiting for pending results... 
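The task just skipped is the first of two installs guarded by `network_state != {}` (the real definitions live at `roles/network/tasks/main.yml:85` and `main.yml:96` in the collection, per the task paths logged above). A hedged sketch of its likely shape; the package list is an assumption inferred from the task title, not copied from the role:

```yaml
# Sketch of the network_state-guarded install skipped above (main.yml:85).
# Package names are assumptions based on the task title.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}
```

Because `network_state` defaults to `{}` in this run, the second condition evaluates False and the task is skipped without contacting the host.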
15406 1726854967.70817: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15406 1726854967.70913: in run() - task 0affcc66-ac2b-3c83-32d3-000000000064 15406 1726854967.70917: variable 'ansible_search_path' from source: unknown 15406 1726854967.70920: variable 'ansible_search_path' from source: unknown 15406 1726854967.70956: calling self._execute() 15406 1726854967.71067: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854967.71296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854967.71300: variable 'omit' from source: magic vars 15406 1726854967.71528: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.71542: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854967.71660: variable 'network_state' from source: role '' defaults 15406 1726854967.71678: Evaluated conditional (network_state != {}): False 15406 1726854967.71692: when evaluation is False, skipping this task 15406 1726854967.71709: _execute() done 15406 1726854967.71722: dumping result to json 15406 1726854967.71732: done dumping result, returning 15406 1726854967.71750: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-3c83-32d3-000000000064] 15406 1726854967.71758: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000064 15406 1726854967.71992: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000064 15406 1726854967.71998: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15406 1726854967.72052: no more pending results, returning what we have 15406 1726854967.72056: results queue empty 15406 1726854967.72057: checking for 
any_errors_fatal 15406 1726854967.72066: done checking for any_errors_fatal 15406 1726854967.72066: checking for max_fail_percentage 15406 1726854967.72068: done checking for max_fail_percentage 15406 1726854967.72069: checking to see if all hosts have failed and the running result is not ok 15406 1726854967.72070: done checking to see if all hosts have failed 15406 1726854967.72070: getting the remaining hosts for this loop 15406 1726854967.72072: done getting the remaining hosts for this loop 15406 1726854967.72075: getting the next task for host managed_node2 15406 1726854967.72082: done getting next task for host managed_node2 15406 1726854967.72085: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15406 1726854967.72089: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854967.72110: getting variables 15406 1726854967.72115: in VariableManager get_vars() 15406 1726854967.72162: Calling all_inventory to load vars for managed_node2 15406 1726854967.72165: Calling groups_inventory to load vars for managed_node2 15406 1726854967.72168: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854967.72180: Calling all_plugins_play to load vars for managed_node2 15406 1726854967.72188: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854967.72193: Calling groups_plugins_play to load vars for managed_node2 15406 1726854967.73912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854967.75540: done with get_vars() 15406 1726854967.75565: done getting variables 15406 1726854967.75628: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:56:07 -0400 (0:00:00.056) 0:00:35.579 ****** 15406 1726854967.75661: entering _queue_task() for managed_node2/service 15406 1726854967.76008: worker is 1 (out of 1 available) 15406 1726854967.76021: exiting _queue_task() for managed_node2/service 15406 1726854967.76032: done queuing things up, now waiting for results queue to drain 15406 1726854967.76034: waiting for pending results... 
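Each skipped task above emits the same per-host result payload: `changed`, the `false_condition` that failed, and a `skip_reason`. A small Python sketch reproducing that shape (values copied from the python3-libnmstate skip logged above), useful when post-processing this kind of log programmatically:

```python
import json

# Shape of the recurring per-host "skipping" payload in this log;
# field values are copied from the python3-libnmstate task above.
skip_result = {
    "changed": False,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False",
}
print(json.dumps(skip_result, indent=4, sort_keys=True))
```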
15406 1726854967.76415: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15406 1726854967.76435: in run() - task 0affcc66-ac2b-3c83-32d3-000000000065 15406 1726854967.76452: variable 'ansible_search_path' from source: unknown 15406 1726854967.76458: variable 'ansible_search_path' from source: unknown 15406 1726854967.76507: calling self._execute() 15406 1726854967.76627: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854967.76640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854967.76655: variable 'omit' from source: magic vars 15406 1726854967.77292: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.77299: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854967.77302: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854967.77422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854967.79584: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854967.79634: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854967.79661: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854967.79690: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854967.79714: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854967.79773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15406 1726854967.79811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.79831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.79856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.79867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.79907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.79924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.79940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.79965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.79975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.80011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.80027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.80044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.80068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.80078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.80202: variable 'network_connections' from source: play vars 15406 1726854967.80214: variable 'profile' from source: play vars 15406 1726854967.80265: variable 'profile' from source: play vars 15406 1726854967.80268: variable 'interface' from source: set_fact 15406 1726854967.80316: variable 'interface' from source: set_fact 15406 1726854967.80368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854967.80482: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854967.80512: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854967.80535: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854967.80559: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854967.80591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854967.80609: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854967.80627: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.80644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854967.80685: variable '__network_team_connections_defined' from source: role '' defaults 15406 1726854967.80840: variable 'network_connections' from source: play vars 15406 1726854967.80843: variable 'profile' from source: play vars 15406 1726854967.80892: variable 'profile' from source: play vars 15406 1726854967.80895: variable 'interface' from source: set_fact 15406 1726854967.80951: variable 'interface' from source: set_fact 15406 1726854967.80978: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15406 1726854967.80985: when evaluation is False, skipping this task 15406 1726854967.80989: _execute() done 15406 1726854967.80991: dumping result to json 15406 1726854967.80993: done dumping result, returning 15406 1726854967.81092: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affcc66-ac2b-3c83-32d3-000000000065] 15406 1726854967.81106: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000065 15406 1726854967.81175: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000065 15406 1726854967.81178: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15406 1726854967.81285: no more pending results, returning what we have 15406 1726854967.81291: results queue empty 15406 1726854967.81292: checking for any_errors_fatal 15406 1726854967.81298: done checking for any_errors_fatal 15406 1726854967.81298: checking for max_fail_percentage 15406 1726854967.81300: done checking for max_fail_percentage 15406 1726854967.81301: checking to see if all hosts have failed and the running result is not ok 15406 1726854967.81302: done checking to see if all hosts have failed 15406 1726854967.81302: getting the remaining hosts for this loop 15406 1726854967.81304: done getting the remaining hosts for this loop 15406 1726854967.81307: getting the next task for host managed_node2 15406 1726854967.81314: done getting next task for host managed_node2 15406 1726854967.81318: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15406 1726854967.81320: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854967.81333: getting variables 15406 1726854967.81334: in VariableManager get_vars() 15406 1726854967.81377: Calling all_inventory to load vars for managed_node2 15406 1726854967.81380: Calling groups_inventory to load vars for managed_node2 15406 1726854967.81381: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854967.81394: Calling all_plugins_play to load vars for managed_node2 15406 1726854967.81397: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854967.81400: Calling groups_plugins_play to load vars for managed_node2 15406 1726854967.82557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854967.83545: done with get_vars() 15406 1726854967.83562: done getting variables 15406 1726854967.83611: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:56:07 -0400 (0:00:00.079) 0:00:35.659 ****** 15406 1726854967.83634: entering _queue_task() for managed_node2/service 15406 1726854967.83881: worker is 1 (out of 1 available) 15406 1726854967.83900: exiting _queue_task() for managed_node2/service 15406 1726854967.83911: done queuing things up, now waiting for results queue to drain 15406 1726854967.83913: waiting for pending results... 
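The restart task skipped above (`main.yml:109`) fires only when wireless or team connections are defined, since NetworkManager must be restarted for those to take effect. A hedged sketch of its likely form; the `service` module usage and unit name are assumptions based on the task title and the logged `false_condition`:

```yaml
# Sketch of the conditional restart skipped above (main.yml:109).
# Service name and module shape are assumptions, not copied from the role.
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

In this run neither flag is set (no wireless or team profiles in `network_connections`), so the guard evaluates False and the play moves straight on to "Enable and start NetworkManager".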
15406 1726854967.84086: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15406 1726854967.84176: in run() - task 0affcc66-ac2b-3c83-32d3-000000000066 15406 1726854967.84392: variable 'ansible_search_path' from source: unknown 15406 1726854967.84398: variable 'ansible_search_path' from source: unknown 15406 1726854967.84401: calling self._execute() 15406 1726854967.84405: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854967.84407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854967.84409: variable 'omit' from source: magic vars 15406 1726854967.84797: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.84815: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854967.85002: variable 'network_provider' from source: set_fact 15406 1726854967.85014: variable 'network_state' from source: role '' defaults 15406 1726854967.85029: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15406 1726854967.85041: variable 'omit' from source: magic vars 15406 1726854967.85103: variable 'omit' from source: magic vars 15406 1726854967.85134: variable 'network_service_name' from source: role '' defaults 15406 1726854967.85213: variable 'network_service_name' from source: role '' defaults 15406 1726854967.85330: variable '__network_provider_setup' from source: role '' defaults 15406 1726854967.85342: variable '__network_service_name_default_nm' from source: role '' defaults 15406 1726854967.85420: variable '__network_service_name_default_nm' from source: role '' defaults 15406 1726854967.85435: variable '__network_packages_default_nm' from source: role '' defaults 15406 1726854967.85593: variable '__network_packages_default_nm' from source: role '' defaults 15406 1726854967.85745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15406 1726854967.87223: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854967.87286: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854967.87344: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854967.87362: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854967.87432: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854967.87503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.87511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.87532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.87693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.87698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.87701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15406 1726854967.87702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.87704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.87735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.87756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.88003: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15406 1726854967.88127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.88157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.88183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.88240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.88259: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.88353: variable 'ansible_python' from source: facts 15406 1726854967.88386: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15406 1726854967.88471: variable '__network_wpa_supplicant_required' from source: role '' defaults 15406 1726854967.88551: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15406 1726854967.88681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.88705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.88723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.88747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.88758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.88871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854967.88882: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854967.88901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.88905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854967.88907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854967.89017: variable 'network_connections' from source: play vars 15406 1726854967.89023: variable 'profile' from source: play vars 15406 1726854967.89084: variable 'profile' from source: play vars 15406 1726854967.89292: variable 'interface' from source: set_fact 15406 1726854967.89295: variable 'interface' from source: set_fact 15406 1726854967.89297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854967.89432: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854967.89488: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854967.89539: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854967.89598: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854967.89662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854967.89700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854967.89736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854967.89773: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854967.89822: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854967.90100: variable 'network_connections' from source: play vars 15406 1726854967.90112: variable 'profile' from source: play vars 15406 1726854967.90231: variable 'profile' from source: play vars 15406 1726854967.90234: variable 'interface' from source: set_fact 15406 1726854967.90249: variable 'interface' from source: set_fact 15406 1726854967.90273: variable '__network_packages_default_wireless' from source: role '' defaults 15406 1726854967.90331: variable '__network_wireless_connections_defined' from source: role '' defaults 15406 1726854967.90516: variable 'network_connections' from source: play vars 15406 1726854967.90520: variable 'profile' from source: play vars 15406 1726854967.90571: variable 'profile' from source: play vars 15406 1726854967.90574: variable 'interface' from source: set_fact 15406 1726854967.90626: variable 'interface' from source: set_fact 15406 1726854967.90645: variable '__network_packages_default_team' from source: role '' defaults 15406 1726854967.90706: variable '__network_team_connections_defined' from source: role '' defaults 15406 1726854967.91064: variable 
'network_connections' from source: play vars 15406 1726854967.91067: variable 'profile' from source: play vars 15406 1726854967.91091: variable 'profile' from source: play vars 15406 1726854967.91097: variable 'interface' from source: set_fact 15406 1726854967.91148: variable 'interface' from source: set_fact 15406 1726854967.91186: variable '__network_service_name_default_initscripts' from source: role '' defaults 15406 1726854967.91233: variable '__network_service_name_default_initscripts' from source: role '' defaults 15406 1726854967.91240: variable '__network_packages_default_initscripts' from source: role '' defaults 15406 1726854967.91281: variable '__network_packages_default_initscripts' from source: role '' defaults 15406 1726854967.91418: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15406 1726854967.91940: variable 'network_connections' from source: play vars 15406 1726854967.91943: variable 'profile' from source: play vars 15406 1726854967.92172: variable 'profile' from source: play vars 15406 1726854967.92175: variable 'interface' from source: set_fact 15406 1726854967.92178: variable 'interface' from source: set_fact 15406 1726854967.92180: variable 'ansible_distribution' from source: facts 15406 1726854967.92182: variable '__network_rh_distros' from source: role '' defaults 15406 1726854967.92184: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.92186: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15406 1726854967.92359: variable 'ansible_distribution' from source: facts 15406 1726854967.92366: variable '__network_rh_distros' from source: role '' defaults 15406 1726854967.92369: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.92393: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15406 1726854967.92843: variable 'ansible_distribution' from source: 
facts 15406 1726854967.92847: variable '__network_rh_distros' from source: role '' defaults 15406 1726854967.92853: variable 'ansible_distribution_major_version' from source: facts 15406 1726854967.92985: variable 'network_provider' from source: set_fact 15406 1726854967.93024: variable 'omit' from source: magic vars 15406 1726854967.93062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854967.93102: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854967.93294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854967.93298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854967.93301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854967.93303: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854967.93305: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854967.93308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854967.93353: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854967.93378: Set connection var ansible_timeout to 10 15406 1726854967.93390: Set connection var ansible_connection to ssh 15406 1726854967.93408: Set connection var ansible_shell_type to sh 15406 1726854967.93429: Set connection var ansible_shell_executable to /bin/sh 15406 1726854967.93447: Set connection var ansible_pipelining to False 15406 1726854967.93494: variable 'ansible_shell_executable' from source: unknown 15406 1726854967.93499: variable 'ansible_connection' from source: unknown 15406 1726854967.93503: variable 'ansible_module_compression' from source: unknown 15406 1726854967.93505: 
variable 'ansible_shell_type' from source: unknown 15406 1726854967.93507: variable 'ansible_shell_executable' from source: unknown 15406 1726854967.93510: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854967.93516: variable 'ansible_pipelining' from source: unknown 15406 1726854967.93518: variable 'ansible_timeout' from source: unknown 15406 1726854967.93520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854967.93600: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854967.93607: variable 'omit' from source: magic vars 15406 1726854967.93610: starting attempt loop 15406 1726854967.93613: running the handler 15406 1726854967.93669: variable 'ansible_facts' from source: unknown 15406 1726854967.94347: _low_level_execute_command(): starting 15406 1726854967.94470: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854967.95705: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854967.95709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854967.95909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854967.95912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 15406 1726854967.95916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854967.95919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854967.96426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854967.96502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854967.98216: stdout chunk (state=3): >>>/root <<< 15406 1726854967.98334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854967.98376: stderr chunk (state=3): >>><<< 15406 1726854967.98381: stdout chunk (state=3): >>><<< 15406 1726854967.98412: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854967.98445: _low_level_execute_command(): starting 15406 1726854967.98448: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854967.9842157-17014-22892600664787 `" && echo ansible-tmp-1726854967.9842157-17014-22892600664787="` echo /root/.ansible/tmp/ansible-tmp-1726854967.9842157-17014-22892600664787 `" ) && sleep 0' 15406 1726854967.98974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854967.98978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854967.98982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854967.98985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854967.98988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854967.99034: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854967.99049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854967.99217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854968.01115: stdout chunk (state=3): >>>ansible-tmp-1726854967.9842157-17014-22892600664787=/root/.ansible/tmp/ansible-tmp-1726854967.9842157-17014-22892600664787 <<< 15406 1726854968.01263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854968.01267: stdout chunk (state=3): >>><<< 15406 1726854968.01269: stderr chunk (state=3): >>><<< 15406 1726854968.01294: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854967.9842157-17014-22892600664787=/root/.ansible/tmp/ansible-tmp-1726854967.9842157-17014-22892600664787 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854968.01327: variable 'ansible_module_compression' from source: unknown 15406 1726854968.01492: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15406 1726854968.01495: variable 'ansible_facts' from source: unknown 15406 1726854968.01685: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854967.9842157-17014-22892600664787/AnsiballZ_systemd.py 15406 1726854968.01849: Sending initial data 15406 1726854968.01858: Sent initial data (155 bytes) 15406 1726854968.02451: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854968.02502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854968.02517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854968.02602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854968.02639: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 
1726854968.02715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854968.04305: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854968.04431: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854968.04544: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpkka0hkel /root/.ansible/tmp/ansible-tmp-1726854967.9842157-17014-22892600664787/AnsiballZ_systemd.py <<< 15406 1726854968.04570: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854967.9842157-17014-22892600664787/AnsiballZ_systemd.py" <<< 15406 1726854968.04574: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpkka0hkel" to remote "/root/.ansible/tmp/ansible-tmp-1726854967.9842157-17014-22892600664787/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854967.9842157-17014-22892600664787/AnsiballZ_systemd.py" <<< 15406 1726854968.06945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854968.07127: stderr chunk (state=3): >>><<< 15406 1726854968.07130: stdout chunk (state=3): >>><<< 15406 1726854968.07133: done transferring module to remote 15406 1726854968.07135: _low_level_execute_command(): starting 15406 1726854968.07137: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854967.9842157-17014-22892600664787/ /root/.ansible/tmp/ansible-tmp-1726854967.9842157-17014-22892600664787/AnsiballZ_systemd.py && sleep 0' 15406 1726854968.07866: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854968.07931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854968.08041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854968.08062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854968.08078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854968.08178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854968.10010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854968.10054: stdout chunk (state=3): >>><<< 15406 1726854968.10113: stderr chunk (state=3): >>><<< 15406 1726854968.10261: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854968.10265: _low_level_execute_command(): starting 15406 1726854968.10267: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854967.9842157-17014-22892600664787/AnsiballZ_systemd.py && sleep 0' 15406 1726854968.11454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854968.11457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854968.11459: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854968.11461: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 15406 1726854968.11463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854968.11469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854968.11702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854968.11797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854968.40825: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6952", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainStartTimestampMonotonic": "534997129", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainHandoffTimestampMonotonic": "535012691", "ExecMainPID": "6952", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4542464", "MemoryPeak": "8568832", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3311759360", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1048413000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", 
"MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 15406 1726854968.40859: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": 
"cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", 
"FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service multi-user.target cloud-init.service shutdown.target", "After": "network-pre.target basic.target cloud-init-local.service dbus-broker.service dbus.socket sysinit.target systemd-journald.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:54:20 EDT", "StateChangeTimestampMonotonic": "643755242", "InactiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveExitTimestampMonotonic": "534998649", "ActiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveEnterTimestampMonotonic": "535088340", "ActiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveExitTimestampMonotonic": "534969671", "InactiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveEnterTimestampMonotonic": "534993742", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": 
"none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ConditionTimestampMonotonic": "534995785", "AssertTimestamp": "Fri 2024-09-20 13:52:31 EDT", "AssertTimestampMonotonic": "534995794", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6b946ff91244430a922b8c06a2cf2878", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15406 1726854968.42906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854968.42924: stdout chunk (state=3): >>><<< 15406 1726854968.42973: stderr chunk (state=3): >>><<< 15406 1726854968.43010: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6952", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", 
"UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainStartTimestampMonotonic": "534997129", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ExecMainHandoffTimestampMonotonic": "535012691", "ExecMainPID": "6952", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4542464", "MemoryPeak": "8568832", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3311759360", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1048413000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": 
"no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", 
"LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service multi-user.target cloud-init.service shutdown.target", "After": "network-pre.target basic.target cloud-init-local.service dbus-broker.service dbus.socket sysinit.target systemd-journald.socket system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:54:20 EDT", "StateChangeTimestampMonotonic": "643755242", "InactiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", 
"InactiveExitTimestampMonotonic": "534998649", "ActiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveEnterTimestampMonotonic": "535088340", "ActiveExitTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ActiveExitTimestampMonotonic": "534969671", "InactiveEnterTimestamp": "Fri 2024-09-20 13:52:31 EDT", "InactiveEnterTimestampMonotonic": "534993742", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:52:31 EDT", "ConditionTimestampMonotonic": "534995785", "AssertTimestamp": "Fri 2024-09-20 13:52:31 EDT", "AssertTimestampMonotonic": "534995794", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6b946ff91244430a922b8c06a2cf2878", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854968.43262: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854967.9842157-17014-22892600664787/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854968.43297: _low_level_execute_command(): starting 15406 1726854968.43377: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854967.9842157-17014-22892600664787/ > /dev/null 2>&1 && sleep 0' 15406 1726854968.44875: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854968.44931: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854968.44946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854968.45000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854968.45204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854968.45276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854968.45405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854968.48003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854968.48007: stdout chunk (state=3): >>><<< 15406 1726854968.48009: stderr chunk (state=3): >>><<< 15406 1726854968.48012: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found 
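The `invocation.module_args` recorded in the module result above (`name: NetworkManager`, `state: started`, `enabled: true`, with `_ansible_no_log: True`) correspond to a role task roughly like the following. This is a sketch reconstructed from the logged arguments, not the role's verbatim source:

```yaml
# Hypothetical reconstruction from the logged module_args; the actual task
# lives in the fedora.linux_system_roles.network role.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true   # matches the "_ansible_no_log: True" seen in the invocation
```

Because `no_log` is set, the controller later prints the censored result (`"the output has been hidden due to the fact that 'no_log: true' was specified"`) even though the full systemd status dictionary is visible here at debug verbosity.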
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854968.48014: handler run complete 15406 1726854968.48016: attempt loop complete, returning result 15406 1726854968.48296: _execute() done 15406 1726854968.48300: dumping result to json 15406 1726854968.48303: done dumping result, returning 15406 1726854968.48305: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-3c83-32d3-000000000066] 15406 1726854968.48307: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000066 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15406 1726854968.48718: no more pending results, returning what we have 15406 1726854968.48723: results queue empty 15406 1726854968.48724: checking for any_errors_fatal 15406 1726854968.48734: done checking for any_errors_fatal 15406 1726854968.48735: checking for max_fail_percentage 15406 1726854968.48737: done checking for max_fail_percentage 15406 1726854968.48738: checking to see if all hosts have failed and the running result is not ok 15406 1726854968.48739: done 
checking to see if all hosts have failed 15406 1726854968.48740: getting the remaining hosts for this loop 15406 1726854968.48741: done getting the remaining hosts for this loop 15406 1726854968.48745: getting the next task for host managed_node2 15406 1726854968.48753: done getting next task for host managed_node2 15406 1726854968.48756: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15406 1726854968.48758: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854968.48769: getting variables 15406 1726854968.48771: in VariableManager get_vars() 15406 1726854968.48850: Calling all_inventory to load vars for managed_node2 15406 1726854968.48853: Calling groups_inventory to load vars for managed_node2 15406 1726854968.48855: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854968.48867: Calling all_plugins_play to load vars for managed_node2 15406 1726854968.48871: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854968.48874: Calling groups_plugins_play to load vars for managed_node2 15406 1726854968.50374: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000066 15406 1726854968.50377: WORKER PROCESS EXITING 15406 1726854968.52130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854968.55412: done with get_vars() 15406 1726854968.55442: done getting variables 15406 1726854968.55541: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:56:08 -0400 (0:00:00.719) 0:00:36.378 ****** 15406 1726854968.55573: entering _queue_task() for managed_node2/service 15406 1726854968.55975: worker is 1 (out of 1 available) 15406 1726854968.55992: exiting _queue_task() for managed_node2/service 15406 1726854968.56007: done queuing things up, now waiting for results queue to drain 15406 1726854968.56008: waiting for pending results... 15406 1726854968.56274: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15406 1726854968.56407: in run() - task 0affcc66-ac2b-3c83-32d3-000000000067 15406 1726854968.56429: variable 'ansible_search_path' from source: unknown 15406 1726854968.56438: variable 'ansible_search_path' from source: unknown 15406 1726854968.56491: calling self._execute() 15406 1726854968.56608: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854968.56621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854968.56635: variable 'omit' from source: magic vars 15406 1726854968.57049: variable 'ansible_distribution_major_version' from source: facts 15406 1726854968.57064: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854968.57192: variable 'network_provider' from source: set_fact 15406 1726854968.57209: Evaluated conditional (network_provider == "nm"): True 15406 1726854968.57314: variable '__network_wpa_supplicant_required' from source: role '' defaults 15406 1726854968.57538: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15406 1726854968.57977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 
1726854968.63170: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854968.63325: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 1726854968.63367: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854968.63423: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854968.63527: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854968.63677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854968.63934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854968.63938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854968.63940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854968.64007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854968.64065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15406 1726854968.64174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854968.64209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854968.64302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854968.64321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854968.64370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854968.64405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854968.64433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854968.64483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854968.64507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854968.64665: variable 'network_connections' from source: play vars 15406 1726854968.64682: variable 'profile' from source: play vars 15406 1726854968.64763: variable 'profile' from source: play vars 15406 1726854968.64772: variable 'interface' from source: set_fact 15406 1726854968.64847: variable 'interface' from source: set_fact 15406 1726854968.64935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15406 1726854968.65136: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15406 1726854968.65177: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15406 1726854968.65216: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15406 1726854968.65257: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15406 1726854968.65310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15406 1726854968.65348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15406 1726854968.65380: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854968.65458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15406 1726854968.65472: variable 
'__network_wireless_connections_defined' from source: role '' defaults 15406 1726854968.65732: variable 'network_connections' from source: play vars 15406 1726854968.65744: variable 'profile' from source: play vars 15406 1726854968.65816: variable 'profile' from source: play vars 15406 1726854968.65825: variable 'interface' from source: set_fact 15406 1726854968.65897: variable 'interface' from source: set_fact 15406 1726854968.66005: Evaluated conditional (__network_wpa_supplicant_required): False 15406 1726854968.66008: when evaluation is False, skipping this task 15406 1726854968.66010: _execute() done 15406 1726854968.66021: dumping result to json 15406 1726854968.66023: done dumping result, returning 15406 1726854968.66026: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-3c83-32d3-000000000067] 15406 1726854968.66028: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000067 15406 1726854968.66098: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000067 15406 1726854968.66101: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15406 1726854968.66153: no more pending results, returning what we have 15406 1726854968.66157: results queue empty 15406 1726854968.66158: checking for any_errors_fatal 15406 1726854968.66172: done checking for any_errors_fatal 15406 1726854968.66172: checking for max_fail_percentage 15406 1726854968.66174: done checking for max_fail_percentage 15406 1726854968.66175: checking to see if all hosts have failed and the running result is not ok 15406 1726854968.66176: done checking to see if all hosts have failed 15406 1726854968.66176: getting the remaining hosts for this loop 15406 1726854968.66177: done getting the remaining hosts for this loop 15406 1726854968.66180: getting the next task for host 
managed_node2 15406 1726854968.66189: done getting next task for host managed_node2 15406 1726854968.66192: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15406 1726854968.66194: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854968.66208: getting variables 15406 1726854968.66210: in VariableManager get_vars() 15406 1726854968.66251: Calling all_inventory to load vars for managed_node2 15406 1726854968.66254: Calling groups_inventory to load vars for managed_node2 15406 1726854968.66256: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854968.66266: Calling all_plugins_play to load vars for managed_node2 15406 1726854968.66268: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854968.66271: Calling groups_plugins_play to load vars for managed_node2 15406 1726854968.68186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854968.69837: done with get_vars() 15406 1726854968.69859: done getting variables 15406 1726854968.69926: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:56:08 -0400 (0:00:00.143) 0:00:36.522 ****** 15406 1726854968.69956: entering _queue_task() for managed_node2/service 15406 
1726854968.70407: worker is 1 (out of 1 available) 15406 1726854968.70420: exiting _queue_task() for managed_node2/service 15406 1726854968.70433: done queuing things up, now waiting for results queue to drain 15406 1726854968.70434: waiting for pending results... 15406 1726854968.71310: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 15406 1726854968.71318: in run() - task 0affcc66-ac2b-3c83-32d3-000000000068 15406 1726854968.71322: variable 'ansible_search_path' from source: unknown 15406 1726854968.71324: variable 'ansible_search_path' from source: unknown 15406 1726854968.71326: calling self._execute() 15406 1726854968.71798: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854968.71803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854968.71805: variable 'omit' from source: magic vars 15406 1726854968.72392: variable 'ansible_distribution_major_version' from source: facts 15406 1726854968.72415: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854968.72715: variable 'network_provider' from source: set_fact 15406 1726854968.72726: Evaluated conditional (network_provider == "initscripts"): False 15406 1726854968.72734: when evaluation is False, skipping this task 15406 1726854968.72741: _execute() done 15406 1726854968.72748: dumping result to json 15406 1726854968.72755: done dumping result, returning 15406 1726854968.72772: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-3c83-32d3-000000000068] 15406 1726854968.72992: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000068 15406 1726854968.73066: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000068 15406 1726854968.73070: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' 
was specified for this result", "changed": false } 15406 1726854968.73146: no more pending results, returning what we have 15406 1726854968.73151: results queue empty 15406 1726854968.73152: checking for any_errors_fatal 15406 1726854968.73165: done checking for any_errors_fatal 15406 1726854968.73166: checking for max_fail_percentage 15406 1726854968.73168: done checking for max_fail_percentage 15406 1726854968.73169: checking to see if all hosts have failed and the running result is not ok 15406 1726854968.73170: done checking to see if all hosts have failed 15406 1726854968.73171: getting the remaining hosts for this loop 15406 1726854968.73172: done getting the remaining hosts for this loop 15406 1726854968.73176: getting the next task for host managed_node2 15406 1726854968.73183: done getting next task for host managed_node2 15406 1726854968.73189: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15406 1726854968.73193: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854968.73211: getting variables 15406 1726854968.73213: in VariableManager get_vars() 15406 1726854968.73253: Calling all_inventory to load vars for managed_node2 15406 1726854968.73256: Calling groups_inventory to load vars for managed_node2 15406 1726854968.73259: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854968.73271: Calling all_plugins_play to load vars for managed_node2 15406 1726854968.73275: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854968.73278: Calling groups_plugins_play to load vars for managed_node2 15406 1726854968.76427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854968.79434: done with get_vars() 15406 1726854968.79464: done getting variables 15406 1726854968.79735: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:56:08 -0400 (0:00:00.098) 0:00:36.620 ****** 15406 1726854968.79769: entering _queue_task() for managed_node2/copy 15406 1726854968.80547: worker is 1 (out of 1 available) 15406 1726854968.80561: exiting _queue_task() for managed_node2/copy 15406 1726854968.80573: done queuing things up, now waiting for results queue to drain 15406 1726854968.80574: waiting for pending results... 
15406 1726854968.81204: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15406 1726854968.81402: in run() - task 0affcc66-ac2b-3c83-32d3-000000000069 15406 1726854968.81422: variable 'ansible_search_path' from source: unknown 15406 1726854968.81794: variable 'ansible_search_path' from source: unknown 15406 1726854968.81800: calling self._execute() 15406 1726854968.81947: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854968.82138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854968.82142: variable 'omit' from source: magic vars 15406 1726854968.82771: variable 'ansible_distribution_major_version' from source: facts 15406 1726854968.83010: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854968.83143: variable 'network_provider' from source: set_fact 15406 1726854968.83155: Evaluated conditional (network_provider == "initscripts"): False 15406 1726854968.83163: when evaluation is False, skipping this task 15406 1726854968.83171: _execute() done 15406 1726854968.83178: dumping result to json 15406 1726854968.83186: done dumping result, returning 15406 1726854968.83209: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-3c83-32d3-000000000069] 15406 1726854968.83220: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000069 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15406 1726854968.83378: no more pending results, returning what we have 15406 1726854968.83382: results queue empty 15406 1726854968.83383: checking for any_errors_fatal 15406 1726854968.83392: done checking for any_errors_fatal 15406 1726854968.83393: checking for max_fail_percentage 15406 
1726854968.83398: done checking for max_fail_percentage 15406 1726854968.83399: checking to see if all hosts have failed and the running result is not ok 15406 1726854968.83400: done checking to see if all hosts have failed 15406 1726854968.83400: getting the remaining hosts for this loop 15406 1726854968.83402: done getting the remaining hosts for this loop 15406 1726854968.83405: getting the next task for host managed_node2 15406 1726854968.83412: done getting next task for host managed_node2 15406 1726854968.83416: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15406 1726854968.83418: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854968.83434: getting variables 15406 1726854968.83435: in VariableManager get_vars() 15406 1726854968.83475: Calling all_inventory to load vars for managed_node2 15406 1726854968.83477: Calling groups_inventory to load vars for managed_node2 15406 1726854968.83480: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854968.83598: Calling all_plugins_play to load vars for managed_node2 15406 1726854968.83603: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854968.83607: Calling groups_plugins_play to load vars for managed_node2 15406 1726854968.84384: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000069 15406 1726854968.84389: WORKER PROCESS EXITING 15406 1726854968.91134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854968.94811: done with get_vars() 15406 1726854968.94840: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:56:08 -0400 (0:00:00.151) 0:00:36.771 ****** 15406 1726854968.94915: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 15406 1726854968.95259: worker is 1 (out of 1 available) 15406 1726854968.95272: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 15406 1726854968.95284: done queuing things up, now waiting for results queue to drain 15406 1726854968.95286: waiting for pending results... 15406 1726854968.95576: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15406 1726854968.95716: in run() - task 0affcc66-ac2b-3c83-32d3-00000000006a 15406 1726854968.95737: variable 'ansible_search_path' from source: unknown 15406 1726854968.95746: variable 'ansible_search_path' from source: unknown 15406 1726854968.95785: calling self._execute() 15406 1726854968.95894: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854968.95910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854968.95926: variable 'omit' from source: magic vars 15406 1726854968.96316: variable 'ansible_distribution_major_version' from source: facts 15406 1726854968.96332: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854968.96344: variable 'omit' from source: magic vars 15406 1726854968.96392: variable 'omit' from source: magic vars 15406 1726854968.96557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15406 1726854968.98722: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15406 1726854968.98809: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15406 
1726854968.98849: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15406 1726854968.98893: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15406 1726854968.98929: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15406 1726854968.99091: variable 'network_provider' from source: set_fact 15406 1726854968.99160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15406 1726854968.99203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15406 1726854968.99234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15406 1726854968.99283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15406 1726854968.99315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15406 1726854968.99397: variable 'omit' from source: magic vars 15406 1726854968.99520: variable 'omit' from source: magic vars 15406 1726854968.99631: variable 'network_connections' from source: play vars 15406 1726854968.99646: variable 'profile' from source: play vars 15406 1726854968.99737: variable 'profile' from source: play vars 15406 1726854968.99741: variable 'interface' from source: set_fact 
15406 1726854968.99785: variable 'interface' from source: set_fact 15406 1726854968.99934: variable 'omit' from source: magic vars 15406 1726854968.99946: variable '__lsr_ansible_managed' from source: task vars 15406 1726854969.00063: variable '__lsr_ansible_managed' from source: task vars 15406 1726854969.00193: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15406 1726854969.00419: Loaded config def from plugin (lookup/template) 15406 1726854969.00429: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15406 1726854969.00459: File lookup term: get_ansible_managed.j2 15406 1726854969.00467: variable 'ansible_search_path' from source: unknown 15406 1726854969.00477: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15406 1726854969.00502: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 
15406 1726854969.00692: variable 'ansible_search_path' from source: unknown 15406 1726854969.08694: variable 'ansible_managed' from source: unknown 15406 1726854969.08847: variable 'omit' from source: magic vars 15406 1726854969.08881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854969.08962: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854969.08966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854969.08968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854969.08972: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854969.09009: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854969.09018: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854969.09025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854969.09129: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854969.09141: Set connection var ansible_timeout to 10 15406 1726854969.09148: Set connection var ansible_connection to ssh 15406 1726854969.09157: Set connection var ansible_shell_type to sh 15406 1726854969.09179: Set connection var ansible_shell_executable to /bin/sh 15406 1726854969.09181: Set connection var ansible_pipelining to False 15406 1726854969.09212: variable 'ansible_shell_executable' from source: unknown 15406 1726854969.09290: variable 'ansible_connection' from source: unknown 15406 1726854969.09293: variable 'ansible_module_compression' from source: unknown 15406 1726854969.09299: variable 'ansible_shell_type' from source: unknown 15406 1726854969.09301: variable 'ansible_shell_executable' 
from source: unknown 15406 1726854969.09304: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854969.09306: variable 'ansible_pipelining' from source: unknown 15406 1726854969.09308: variable 'ansible_timeout' from source: unknown 15406 1726854969.09310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854969.09402: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854969.09428: variable 'omit' from source: magic vars 15406 1726854969.09440: starting attempt loop 15406 1726854969.09447: running the handler 15406 1726854969.09464: _low_level_execute_command(): starting 15406 1726854969.09474: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854969.10194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854969.10214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854969.10228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854969.10248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854969.10360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854969.10377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854969.10401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854969.10606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854969.12275: stdout chunk (state=3): >>>/root <<< 15406 1726854969.12384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854969.12391: stdout chunk (state=3): >>><<< 15406 1726854969.12393: stderr chunk (state=3): >>><<< 15406 1726854969.12418: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 15406 1726854969.12690: _low_level_execute_command(): starting 15406 1726854969.12697: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854969.1259732-17061-228961970414803 `" && echo ansible-tmp-1726854969.1259732-17061-228961970414803="` echo /root/.ansible/tmp/ansible-tmp-1726854969.1259732-17061-228961970414803 `" ) && sleep 0' 15406 1726854969.13886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854969.14039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854969.14114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854969.14130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854969.16047: stdout chunk (state=3): 
>>>ansible-tmp-1726854969.1259732-17061-228961970414803=/root/.ansible/tmp/ansible-tmp-1726854969.1259732-17061-228961970414803 <<< 15406 1726854969.16150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854969.16190: stderr chunk (state=3): >>><<< 15406 1726854969.16193: stdout chunk (state=3): >>><<< 15406 1726854969.16212: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854969.1259732-17061-228961970414803=/root/.ansible/tmp/ansible-tmp-1726854969.1259732-17061-228961970414803 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854969.16260: variable 'ansible_module_compression' from source: unknown 15406 1726854969.16597: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15406 1726854969.16600: variable 'ansible_facts' from source: unknown 15406 1726854969.16698: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854969.1259732-17061-228961970414803/AnsiballZ_network_connections.py 15406 1726854969.16871: Sending initial data 15406 1726854969.16898: Sent initial data (168 bytes) 15406 1726854969.18036: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854969.18052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854969.18107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854969.18176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854969.18202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854969.18219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854969.18327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
15406 1726854969.20030: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854969.20120: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15406 1726854969.20158: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854969.1259732-17061-228961970414803/AnsiballZ_network_connections.py" <<< 15406 1726854969.20161: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpfd5re5b1 /root/.ansible/tmp/ansible-tmp-1726854969.1259732-17061-228961970414803/AnsiballZ_network_connections.py <<< 15406 1726854969.20280: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpfd5re5b1" to remote "/root/.ansible/tmp/ansible-tmp-1726854969.1259732-17061-228961970414803/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854969.1259732-17061-228961970414803/AnsiballZ_network_connections.py" <<< 15406 1726854969.21602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854969.21636: stderr chunk (state=3): >>><<< 15406 1726854969.21645: stdout chunk (state=3): >>><<< 15406 
1726854969.21680: done transferring module to remote 15406 1726854969.21711: _low_level_execute_command(): starting 15406 1726854969.21728: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854969.1259732-17061-228961970414803/ /root/.ansible/tmp/ansible-tmp-1726854969.1259732-17061-228961970414803/AnsiballZ_network_connections.py && sleep 0' 15406 1726854969.22411: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854969.22478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854969.22505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854969.22611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854969.24460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854969.24464: stdout chunk (state=3): >>><<< 15406 1726854969.24466: stderr chunk (state=3): >>><<< 15406 1726854969.24482: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854969.24493: _low_level_execute_command(): starting 15406 1726854969.24642: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854969.1259732-17061-228961970414803/AnsiballZ_network_connections.py && sleep 0' 15406 1726854969.25601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854969.25617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854969.25631: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854969.25690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854969.25693: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854969.25980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854969.26058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854969.52285: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_0o5zdrso/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_0o5zdrso/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/2f13cc6b-f32a-48d6-a24a-0c29170576ff: error=unknown <<< 15406 1726854969.52533: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": 
"\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15406 1726854969.54208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854969.54220: stdout chunk (state=3): >>><<< 15406 1726854969.54530: stderr chunk (state=3): >>><<< 15406 1726854969.54534: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_0o5zdrso/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_0o5zdrso/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/2f13cc6b-f32a-48d6-a24a-0c29170576ff: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": 
{"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
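The stdout above mixes a Python traceback (raised inside the module's NetworkManager volatilize callback) with the module's JSON result, yet the task still returns `changed: true`: the result object is recovered from the mixed stream rather than from stdout as a whole. A minimal sketch of that recovery step, assuming a scan-from-the-end strategy (this is illustrative, not Ansible's actual parser; the sample string and helper name are hypothetical):

```python
import json

# Hypothetical sample mirroring the stdout above: a traceback printed by a
# callback, followed by the module's single-line JSON result.
stdout = (
    "Traceback (most recent call last):\n"
    "  ...\n"
    "LsrNetworkNmError: Connection volatilize aborted: error=unknown\n"
    '{"changed": true, "warnings": [], "stderr": "\\n"}\n'
)

def extract_module_result(text):
    """Scan lines from the end and return the first parseable JSON object."""
    for line in reversed(text.splitlines()):
        line = line.strip()
        if line.startswith("{"):
            try:
                return json.loads(line)
            except ValueError:
                continue  # not the result payload; keep scanning upward
    return None

result = extract_module_result(stdout)
print(result["changed"])  # prints True despite the traceback earlier in stdout
```

This is why the traceback is effectively swallowed here: only the JSON payload feeds the task result, and the module reported success in it.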
15406 1726854969.54536: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854969.1259732-17061-228961970414803/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854969.54539: _low_level_execute_command(): starting 15406 1726854969.54541: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854969.1259732-17061-228961970414803/ > /dev/null 2>&1 && sleep 0' 15406 1726854969.56290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854969.56323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854969.56363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854969.56418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854969.56500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854969.56527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854969.56543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854969.56562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854969.56680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854969.58628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854969.58648: stdout chunk (state=3): >>><<< 15406 1726854969.58867: stderr chunk (state=3): >>><<< 15406 1726854969.58871: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854969.58874: handler run complete 15406 1726854969.58876: attempt loop complete, returning result 15406 1726854969.58878: _execute() done 15406 1726854969.58880: dumping result to json 15406 1726854969.58882: done dumping result, returning 15406 1726854969.58890: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-3c83-32d3-00000000006a] 15406 1726854969.58892: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000006a changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15406 1726854969.59390: no more pending results, returning what we have 15406 1726854969.59394: results queue empty 15406 1726854969.59395: checking for any_errors_fatal 15406 1726854969.59406: done checking for any_errors_fatal 15406 1726854969.59408: checking for max_fail_percentage 15406 1726854969.59409: done checking for max_fail_percentage 15406 1726854969.59411: checking to see if all hosts have failed and the running result is not ok 15406 1726854969.59412: done checking to see if all hosts have failed 15406 1726854969.59412: getting the remaining hosts for this loop 15406 1726854969.59414: done getting the remaining hosts for this loop 15406 1726854969.59418: getting the next task for host managed_node2 15406 1726854969.59424: done getting next task for host managed_node2 15406 1726854969.59427: ^ task is: TASK: 
fedora.linux_system_roles.network : Configure networking state 15406 1726854969.59429: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854969.59439: getting variables 15406 1726854969.59441: in VariableManager get_vars() 15406 1726854969.59481: Calling all_inventory to load vars for managed_node2 15406 1726854969.59485: Calling groups_inventory to load vars for managed_node2 15406 1726854969.59692: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854969.59712: Calling all_plugins_play to load vars for managed_node2 15406 1726854969.59717: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854969.59723: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000006a 15406 1726854969.59726: WORKER PROCESS EXITING 15406 1726854969.59730: Calling groups_plugins_play to load vars for managed_node2 15406 1726854969.62276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854969.64218: done with get_vars() 15406 1726854969.64246: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:56:09 -0400 (0:00:00.694) 0:00:37.466 ****** 15406 1726854969.64335: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 15406 1726854969.64806: worker is 1 (out of 1 available) 15406 1726854969.64818: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 15406 1726854969.64828: done queuing things up, now waiting for results queue to drain 15406 1726854969.64830: waiting for pending 
results... 15406 1726854969.65143: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 15406 1726854969.65359: in run() - task 0affcc66-ac2b-3c83-32d3-00000000006b 15406 1726854969.65380: variable 'ansible_search_path' from source: unknown 15406 1726854969.65391: variable 'ansible_search_path' from source: unknown 15406 1726854969.65443: calling self._execute() 15406 1726854969.65555: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854969.65567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854969.65582: variable 'omit' from source: magic vars 15406 1726854969.65994: variable 'ansible_distribution_major_version' from source: facts 15406 1726854969.66011: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854969.66139: variable 'network_state' from source: role '' defaults 15406 1726854969.66156: Evaluated conditional (network_state != {}): False 15406 1726854969.66169: when evaluation is False, skipping this task 15406 1726854969.66177: _execute() done 15406 1726854969.66184: dumping result to json 15406 1726854969.66200: done dumping result, returning 15406 1726854969.66213: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-3c83-32d3-00000000006b] 15406 1726854969.66222: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000006b 15406 1726854969.66456: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000006b 15406 1726854969.66459: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15406 1726854969.66516: no more pending results, returning what we have 15406 1726854969.66525: results queue empty 15406 1726854969.66526: checking for any_errors_fatal 15406 1726854969.66540: done checking for 
any_errors_fatal 15406 1726854969.66541: checking for max_fail_percentage 15406 1726854969.66543: done checking for max_fail_percentage 15406 1726854969.66544: checking to see if all hosts have failed and the running result is not ok 15406 1726854969.66545: done checking to see if all hosts have failed 15406 1726854969.66546: getting the remaining hosts for this loop 15406 1726854969.66547: done getting the remaining hosts for this loop 15406 1726854969.66551: getting the next task for host managed_node2 15406 1726854969.66558: done getting next task for host managed_node2 15406 1726854969.66561: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15406 1726854969.66564: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854969.66579: getting variables 15406 1726854969.66581: in VariableManager get_vars() 15406 1726854969.66623: Calling all_inventory to load vars for managed_node2 15406 1726854969.66626: Calling groups_inventory to load vars for managed_node2 15406 1726854969.66713: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854969.66766: Calling all_plugins_play to load vars for managed_node2 15406 1726854969.66771: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854969.66775: Calling groups_plugins_play to load vars for managed_node2 15406 1726854969.68308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854969.70503: done with get_vars() 15406 1726854969.70570: done getting variables 15406 1726854969.70638: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:56:09 -0400 (0:00:00.063) 0:00:37.529 ****** 15406 1726854969.70721: entering _queue_task() for managed_node2/debug 15406 1726854969.71326: worker is 1 (out of 1 available) 15406 1726854969.71342: exiting _queue_task() for managed_node2/debug 15406 1726854969.71354: done queuing things up, now waiting for results queue to drain 15406 1726854969.71356: waiting for pending results... 
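The "Configure networking connection profiles" task earlier in this log follows the usual remote lifecycle visible in the `_low_level_execute_command()` calls: create a per-task tmp dir, transfer `AnsiballZ_network_connections.py` over sftp, `chmod u+x` it, execute it with the remote Python, then `rm -f -r` the dir. A local re-enactment of those steps (a sketch under stated assumptions: paths and the stand-in payload are hypothetical, and the transfer step is replaced by a local file write):

```python
import subprocess
import tempfile
from pathlib import Path

# 1) per-task tmp dir (stand-in for /root/.ansible/tmp/ansible-tmp-...)
tmpdir = Path(tempfile.mkdtemp(prefix="ansible-tmp-"))
module = tmpdir / "AnsiballZ_network_connections.py"

# 2) "transfer" the module payload (locally written here instead of sftp put)
module.write_text("print('{\"changed\": true}')\n")

# 3) chmod u+x equivalent
module.chmod(0o700)

# 4) execute with the interpreter, capturing stdout as Ansible does
proc = subprocess.run(["python3", str(module)], capture_output=True, text=True)
print(proc.stdout.strip())

# 5) cleanup, matching the final rm -f -r in the log
subprocess.run(["rm", "-rf", str(tmpdir)])
```

The per-task directory name in the real log (`ansible-tmp-<timestamp>-<pid>-<random>`) makes concurrent tasks collision-free; the sketch gets the same property from `mkdtemp`.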
15406 1726854969.71954: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15406 1726854969.72031: in run() - task 0affcc66-ac2b-3c83-32d3-00000000006c 15406 1726854969.72128: variable 'ansible_search_path' from source: unknown 15406 1726854969.72153: variable 'ansible_search_path' from source: unknown 15406 1726854969.72226: calling self._execute() 15406 1726854969.72337: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854969.72389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854969.72397: variable 'omit' from source: magic vars 15406 1726854969.73046: variable 'ansible_distribution_major_version' from source: facts 15406 1726854969.73057: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854969.73096: variable 'omit' from source: magic vars 15406 1726854969.73150: variable 'omit' from source: magic vars 15406 1726854969.73199: variable 'omit' from source: magic vars 15406 1726854969.73278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854969.73307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854969.73332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854969.73353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854969.73404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854969.73475: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854969.73502: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854969.73504: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 15406 1726854969.73613: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854969.73693: Set connection var ansible_timeout to 10 15406 1726854969.73696: Set connection var ansible_connection to ssh 15406 1726854969.73698: Set connection var ansible_shell_type to sh 15406 1726854969.73700: Set connection var ansible_shell_executable to /bin/sh 15406 1726854969.73702: Set connection var ansible_pipelining to False 15406 1726854969.73703: variable 'ansible_shell_executable' from source: unknown 15406 1726854969.73705: variable 'ansible_connection' from source: unknown 15406 1726854969.73707: variable 'ansible_module_compression' from source: unknown 15406 1726854969.73709: variable 'ansible_shell_type' from source: unknown 15406 1726854969.73711: variable 'ansible_shell_executable' from source: unknown 15406 1726854969.73715: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854969.73717: variable 'ansible_pipelining' from source: unknown 15406 1726854969.73719: variable 'ansible_timeout' from source: unknown 15406 1726854969.73721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854969.73877: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854969.74017: variable 'omit' from source: magic vars 15406 1726854969.74025: starting attempt loop 15406 1726854969.74028: running the handler 15406 1726854969.74179: variable '__network_connections_result' from source: set_fact 15406 1726854969.74246: handler run complete 15406 1726854969.74269: attempt loop complete, returning result 15406 1726854969.74276: _execute() done 15406 1726854969.74284: dumping result to json 15406 1726854969.74296: 
done dumping result, returning 15406 1726854969.74309: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-3c83-32d3-00000000006c] 15406 1726854969.74318: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000006c 15406 1726854969.74669: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000006c 15406 1726854969.74674: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 15406 1726854969.74735: no more pending results, returning what we have 15406 1726854969.74738: results queue empty 15406 1726854969.74740: checking for any_errors_fatal 15406 1726854969.74745: done checking for any_errors_fatal 15406 1726854969.74746: checking for max_fail_percentage 15406 1726854969.74748: done checking for max_fail_percentage 15406 1726854969.74749: checking to see if all hosts have failed and the running result is not ok 15406 1726854969.74750: done checking to see if all hosts have failed 15406 1726854969.74751: getting the remaining hosts for this loop 15406 1726854969.74752: done getting the remaining hosts for this loop 15406 1726854969.74756: getting the next task for host managed_node2 15406 1726854969.74761: done getting next task for host managed_node2 15406 1726854969.74765: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15406 1726854969.74767: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854969.74776: getting variables 15406 1726854969.74778: in VariableManager get_vars() 15406 1726854969.74822: Calling all_inventory to load vars for managed_node2 15406 1726854969.74825: Calling groups_inventory to load vars for managed_node2 15406 1726854969.74828: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854969.74838: Calling all_plugins_play to load vars for managed_node2 15406 1726854969.74841: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854969.74844: Calling groups_plugins_play to load vars for managed_node2 15406 1726854969.76529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854969.79008: done with get_vars() 15406 1726854969.79062: done getting variables 15406 1726854969.79181: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:56:09 -0400 (0:00:00.085) 0:00:37.614 ****** 15406 1726854969.79225: entering _queue_task() for managed_node2/debug 15406 1726854969.79765: worker is 1 (out of 1 available) 15406 1726854969.79779: exiting _queue_task() for managed_node2/debug 15406 1726854969.79795: done queuing things up, now waiting for results queue to drain 15406 1726854969.79796: waiting for pending results... 
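The skip of "Configure networking state" above follows from two conditionals evaluated in order: the distribution check passes, then `network_state != {}` evaluates False against the role default of `{}`, so the handler never runs. A sketch reproducing that short-circuit order (the variable values are assumptions mirroring what the log shows):

```python
# Values mirroring the log: any major version other than '6', and the
# network_state role default of an empty dict.
ansible_distribution_major_version = "9"  # assumption: a non-'6' value
network_state = {}                        # role default, per the log

conditions = [
    ("ansible_distribution_major_version != '6'",
     ansible_distribution_major_version != "6"),
    ("network_state != {}", network_state != {}),
]

for expr, outcome in conditions:
    print(f"Evaluated conditional ({expr}): {outcome}")
    if not outcome:
        # Matches the log: "when evaluation is False, skipping this task"
        print("when evaluation is False, skipping this task")
        break
```

Note the task result still records `"false_condition": "network_state != {}"`, i.e. only the first failing conditional is reported.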
15406 1726854969.80207: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15406 1726854969.80212: in run() - task 0affcc66-ac2b-3c83-32d3-00000000006d 15406 1726854969.80216: variable 'ansible_search_path' from source: unknown 15406 1726854969.80218: variable 'ansible_search_path' from source: unknown 15406 1726854969.80239: calling self._execute() 15406 1726854969.80342: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854969.80366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854969.80381: variable 'omit' from source: magic vars 15406 1726854969.80779: variable 'ansible_distribution_major_version' from source: facts 15406 1726854969.80797: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854969.80808: variable 'omit' from source: magic vars 15406 1726854969.80860: variable 'omit' from source: magic vars 15406 1726854969.80993: variable 'omit' from source: magic vars 15406 1726854969.81043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854969.81200: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854969.81235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854969.81256: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854969.81271: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854969.81312: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854969.81320: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854969.81327: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 15406 1726854969.81452: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854969.81497: Set connection var ansible_timeout to 10 15406 1726854969.81504: Set connection var ansible_connection to ssh 15406 1726854969.81507: Set connection var ansible_shell_type to sh 15406 1726854969.81510: Set connection var ansible_shell_executable to /bin/sh 15406 1726854969.81612: Set connection var ansible_pipelining to False 15406 1726854969.81615: variable 'ansible_shell_executable' from source: unknown 15406 1726854969.81618: variable 'ansible_connection' from source: unknown 15406 1726854969.81622: variable 'ansible_module_compression' from source: unknown 15406 1726854969.81624: variable 'ansible_shell_type' from source: unknown 15406 1726854969.81626: variable 'ansible_shell_executable' from source: unknown 15406 1726854969.81628: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854969.81630: variable 'ansible_pipelining' from source: unknown 15406 1726854969.81632: variable 'ansible_timeout' from source: unknown 15406 1726854969.81634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854969.81828: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854969.81832: variable 'omit' from source: magic vars 15406 1726854969.81835: starting attempt loop 15406 1726854969.81837: running the handler 15406 1726854969.81839: variable '__network_connections_result' from source: set_fact 15406 1726854969.81912: variable '__network_connections_result' from source: set_fact 15406 1726854969.82031: handler run complete 15406 1726854969.82065: attempt loop complete, returning result 15406 1726854969.82075: 
_execute() done 15406 1726854969.82082: dumping result to json 15406 1726854969.82091: done dumping result, returning 15406 1726854969.82103: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-3c83-32d3-00000000006d] 15406 1726854969.82111: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000006d ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 15406 1726854969.82445: no more pending results, returning what we have 15406 1726854969.82448: results queue empty 15406 1726854969.82449: checking for any_errors_fatal 15406 1726854969.82457: done checking for any_errors_fatal 15406 1726854969.82457: checking for max_fail_percentage 15406 1726854969.82461: done checking for max_fail_percentage 15406 1726854969.82462: checking to see if all hosts have failed and the running result is not ok 15406 1726854969.82463: done checking to see if all hosts have failed 15406 1726854969.82464: getting the remaining hosts for this loop 15406 1726854969.82465: done getting the remaining hosts for this loop 15406 1726854969.82469: getting the next task for host managed_node2 15406 1726854969.82475: done getting next task for host managed_node2 15406 1726854969.82478: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15406 1726854969.82480: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15406 1726854969.82505: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000006d 15406 1726854969.82509: WORKER PROCESS EXITING 15406 1726854969.82520: getting variables 15406 1726854969.82522: in VariableManager get_vars() 15406 1726854969.82716: Calling all_inventory to load vars for managed_node2 15406 1726854969.82719: Calling groups_inventory to load vars for managed_node2 15406 1726854969.82722: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854969.82731: Calling all_plugins_play to load vars for managed_node2 15406 1726854969.82734: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854969.82737: Calling groups_plugins_play to load vars for managed_node2 15406 1726854969.85172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854969.88258: done with get_vars() 15406 1726854969.88286: done getting variables 15406 1726854969.88349: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:56:09 -0400 (0:00:00.091) 0:00:37.706 ****** 15406 1726854969.88394: entering _queue_task() for managed_node2/debug 15406 1726854969.88953: worker is 1 (out of 1 available) 15406 1726854969.89058: exiting _queue_task() for managed_node2/debug 15406 1726854969.89110: done queuing things up, now waiting for results queue to drain 15406 1726854969.89111: waiting for pending results... 
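The "Show debug messages for the network_state" task queued above is guarded by a conditional; the evaluator below reports `Evaluated conditional (network_state != {}): False` and skips it. A sketch of what such a guarded task plausibly looks like — the variable name shown is a guess, only the `when:` expression is taken from the log:

```yaml
# Hypothetical reconstruction of roles/network/tasks/main.yml:186.
# With the role default network_state == {}, the `when:` evaluates to
# False and the task is skipped, exactly as the log records.
- name: Show debug messages for the network_state
  debug:
    var: __network_state_result   # variable name assumed for illustration
  when: network_state != {}
```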
15406 1726854969.89318: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15406 1726854969.89466: in run() - task 0affcc66-ac2b-3c83-32d3-00000000006e 15406 1726854969.89470: variable 'ansible_search_path' from source: unknown 15406 1726854969.89473: variable 'ansible_search_path' from source: unknown 15406 1726854969.89494: calling self._execute() 15406 1726854969.89582: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854969.89589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854969.89686: variable 'omit' from source: magic vars 15406 1726854969.90018: variable 'ansible_distribution_major_version' from source: facts 15406 1726854969.90034: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854969.90159: variable 'network_state' from source: role '' defaults 15406 1726854969.90169: Evaluated conditional (network_state != {}): False 15406 1726854969.90172: when evaluation is False, skipping this task 15406 1726854969.90193: _execute() done 15406 1726854969.90197: dumping result to json 15406 1726854969.90200: done dumping result, returning 15406 1726854969.90203: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-3c83-32d3-00000000006e] 15406 1726854969.90206: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000006e 15406 1726854969.90405: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000006e 15406 1726854969.90408: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 15406 1726854969.90471: no more pending results, returning what we have 15406 1726854969.90474: results queue empty 15406 1726854969.90475: checking for any_errors_fatal 15406 1726854969.90482: done checking for any_errors_fatal 15406 1726854969.90483: checking for 
max_fail_percentage 15406 1726854969.90485: done checking for max_fail_percentage 15406 1726854969.90485: checking to see if all hosts have failed and the running result is not ok 15406 1726854969.90486: done checking to see if all hosts have failed 15406 1726854969.90488: getting the remaining hosts for this loop 15406 1726854969.90492: done getting the remaining hosts for this loop 15406 1726854969.90495: getting the next task for host managed_node2 15406 1726854969.90500: done getting next task for host managed_node2 15406 1726854969.90503: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15406 1726854969.90506: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854969.90557: getting variables 15406 1726854969.90559: in VariableManager get_vars() 15406 1726854969.90602: Calling all_inventory to load vars for managed_node2 15406 1726854969.90605: Calling groups_inventory to load vars for managed_node2 15406 1726854969.90608: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854969.90618: Calling all_plugins_play to load vars for managed_node2 15406 1726854969.90620: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854969.90623: Calling groups_plugins_play to load vars for managed_node2 15406 1726854969.92277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854969.93803: done with get_vars() 15406 1726854969.93824: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:56:09 -0400 
(0:00:00.055) 0:00:37.761 ****** 15406 1726854969.93924: entering _queue_task() for managed_node2/ping 15406 1726854969.94256: worker is 1 (out of 1 available) 15406 1726854969.94268: exiting _queue_task() for managed_node2/ping 15406 1726854969.94280: done queuing things up, now waiting for results queue to drain 15406 1726854969.94281: waiting for pending results... 15406 1726854969.94740: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 15406 1726854969.94745: in run() - task 0affcc66-ac2b-3c83-32d3-00000000006f 15406 1726854969.94749: variable 'ansible_search_path' from source: unknown 15406 1726854969.94752: variable 'ansible_search_path' from source: unknown 15406 1726854969.94755: calling self._execute() 15406 1726854969.94848: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854969.94852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854969.94862: variable 'omit' from source: magic vars 15406 1726854969.95283: variable 'ansible_distribution_major_version' from source: facts 15406 1726854969.95321: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854969.95335: variable 'omit' from source: magic vars 15406 1726854969.95386: variable 'omit' from source: magic vars 15406 1726854969.95404: variable 'omit' from source: magic vars 15406 1726854969.95444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854969.95507: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854969.95544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854969.95547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854969.95561: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854969.95608: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854969.95612: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854969.95614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854969.95746: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854969.95810: Set connection var ansible_timeout to 10 15406 1726854969.95813: Set connection var ansible_connection to ssh 15406 1726854969.95815: Set connection var ansible_shell_type to sh 15406 1726854969.95817: Set connection var ansible_shell_executable to /bin/sh 15406 1726854969.95819: Set connection var ansible_pipelining to False 15406 1726854969.95821: variable 'ansible_shell_executable' from source: unknown 15406 1726854969.95824: variable 'ansible_connection' from source: unknown 15406 1726854969.95826: variable 'ansible_module_compression' from source: unknown 15406 1726854969.95828: variable 'ansible_shell_type' from source: unknown 15406 1726854969.95830: variable 'ansible_shell_executable' from source: unknown 15406 1726854969.95832: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854969.95835: variable 'ansible_pipelining' from source: unknown 15406 1726854969.95837: variable 'ansible_timeout' from source: unknown 15406 1726854969.95839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854969.96091: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854969.96133: variable 'omit' from source: magic vars 15406 1726854969.96137: starting attempt loop 15406 1726854969.96140: running 
the handler 15406 1726854969.96186: _low_level_execute_command(): starting 15406 1726854969.96191: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854969.97160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854969.97240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854969.97245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854969.97248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854969.97324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854969.99045: stdout chunk (state=3): >>>/root <<< 15406 1726854969.99207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854969.99210: stdout chunk (state=3): >>><<< 15406 1726854969.99213: stderr chunk (state=3): >>><<< 15406 1726854969.99232: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854969.99249: _low_level_execute_command(): starting 15406 1726854969.99259: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854969.9923859-17096-73638563788812 `" && echo ansible-tmp-1726854969.9923859-17096-73638563788812="` echo /root/.ansible/tmp/ansible-tmp-1726854969.9923859-17096-73638563788812 `" ) && sleep 0' 15406 1726854969.99873: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854969.99889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854969.99907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854969.99923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 
1726854969.99945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854970.00007: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854970.00065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854970.00081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854970.00106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854970.00212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854970.02312: stdout chunk (state=3): >>>ansible-tmp-1726854969.9923859-17096-73638563788812=/root/.ansible/tmp/ansible-tmp-1726854969.9923859-17096-73638563788812 <<< 15406 1726854970.02316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854970.02318: stdout chunk (state=3): >>><<< 15406 1726854970.02321: stderr chunk (state=3): >>><<< 15406 1726854970.02695: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854969.9923859-17096-73638563788812=/root/.ansible/tmp/ansible-tmp-1726854969.9923859-17096-73638563788812 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854970.02700: variable 'ansible_module_compression' from source: unknown 15406 1726854970.02704: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15406 1726854970.02708: variable 'ansible_facts' from source: unknown 15406 1726854970.02765: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854969.9923859-17096-73638563788812/AnsiballZ_ping.py 15406 1726854970.03017: Sending initial data 15406 1726854970.03020: Sent initial data (152 bytes) 15406 1726854970.04419: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854970.04435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854970.04534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854970.06073: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854970.06164: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854970.06257: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp9wfqnv64 /root/.ansible/tmp/ansible-tmp-1726854969.9923859-17096-73638563788812/AnsiballZ_ping.py <<< 15406 1726854970.06284: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854969.9923859-17096-73638563788812/AnsiballZ_ping.py" <<< 15406 1726854970.06342: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp9wfqnv64" to remote "/root/.ansible/tmp/ansible-tmp-1726854969.9923859-17096-73638563788812/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854969.9923859-17096-73638563788812/AnsiballZ_ping.py" <<< 15406 1726854970.07222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854970.07235: stdout chunk (state=3): >>><<< 15406 1726854970.07251: stderr chunk (state=3): >>><<< 15406 1726854970.07325: done transferring module to remote 15406 1726854970.07354: _low_level_execute_command(): starting 15406 1726854970.07371: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854969.9923859-17096-73638563788812/ /root/.ansible/tmp/ansible-tmp-1726854969.9923859-17096-73638563788812/AnsiballZ_ping.py && sleep 0' 15406 1726854970.08458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854970.08615: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854970.08618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854970.08675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854970.08678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854970.08734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854970.08806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854970.10544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854970.10569: stderr chunk (state=3): >>><<< 15406 1726854970.10606: stdout chunk (state=3): >>><<< 15406 1726854970.10905: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854970.10913: _low_level_execute_command(): starting 15406 1726854970.10916: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854969.9923859-17096-73638563788812/AnsiballZ_ping.py && sleep 0' 15406 1726854970.12008: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854970.12082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854970.12303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854970.12417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 
1726854970.27252: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15406 1726854970.28538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854970.28542: stdout chunk (state=3): >>><<< 15406 1726854970.28547: stderr chunk (state=3): >>><<< 15406 1726854970.28567: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
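The stdout chunk above is the JSON result printed by `AnsiballZ_ping.py` on the remote host. As a rough illustration of the contract `ansible.builtin.ping` implements (this is a sketch, not the module's actual source), the `ping` function below echoes a `data` argument back as `{"ping": ...}`:

```python
# Illustrative sketch of the ping module's contract, not its real source.
# The module returns its "data" argument (default "pong") as {"ping": ...};
# passing data="crash" makes the real module deliberately raise an exception.
import json


def ping(data: str = "pong") -> dict:
    """Return the result dict that the AnsiballZ wrapper prints as JSON."""
    if data == "crash":
        raise Exception("boom")  # the real module fails the task here
    return {"ping": data}


# Mirrors the stdout chunk in the log (minus the "invocation" echo).
print(json.dumps(ping()))  # → {"ping": "pong"}
```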
15406 1726854970.28603: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854969.9923859-17096-73638563788812/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854970.28611: _low_level_execute_command(): starting 15406 1726854970.28802: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854969.9923859-17096-73638563788812/ > /dev/null 2>&1 && sleep 0' 15406 1726854970.29296: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854970.29310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854970.29321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854970.29335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854970.29348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854970.29444: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854970.29694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854970.29904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854970.31645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854970.31760: stderr chunk (state=3): >>><<< 15406 1726854970.31763: stdout chunk (state=3): >>><<< 15406 1726854970.31766: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 15406 1726854970.31768: handler run complete 15406 1726854970.31770: attempt loop complete, returning result 15406 1726854970.31772: _execute() done 15406 1726854970.31774: dumping result to json 15406 1726854970.31776: done dumping result, returning 15406 1726854970.31778: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-3c83-32d3-00000000006f] 15406 1726854970.31780: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000006f 15406 1726854970.31858: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000006f 15406 1726854970.31860: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 15406 1726854970.31945: no more pending results, returning what we have 15406 1726854970.31948: results queue empty 15406 1726854970.31949: checking for any_errors_fatal 15406 1726854970.31956: done checking for any_errors_fatal 15406 1726854970.31956: checking for max_fail_percentage 15406 1726854970.31958: done checking for max_fail_percentage 15406 1726854970.31959: checking to see if all hosts have failed and the running result is not ok 15406 1726854970.31960: done checking to see if all hosts have failed 15406 1726854970.31960: getting the remaining hosts for this loop 15406 1726854970.31962: done getting the remaining hosts for this loop 15406 1726854970.31965: getting the next task for host managed_node2 15406 1726854970.31973: done getting next task for host managed_node2 15406 1726854970.31975: ^ task is: TASK: meta (role_complete) 15406 1726854970.31977: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854970.31985: getting variables 15406 1726854970.31989: in VariableManager get_vars() 15406 1726854970.32027: Calling all_inventory to load vars for managed_node2 15406 1726854970.32029: Calling groups_inventory to load vars for managed_node2 15406 1726854970.32031: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854970.32041: Calling all_plugins_play to load vars for managed_node2 15406 1726854970.32044: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854970.32046: Calling groups_plugins_play to load vars for managed_node2 15406 1726854970.35765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854970.38734: done with get_vars() 15406 1726854970.38762: done getting variables 15406 1726854970.38865: done queuing things up, now waiting for results queue to drain 15406 1726854970.38868: results queue empty 15406 1726854970.38869: checking for any_errors_fatal 15406 1726854970.38872: done checking for any_errors_fatal 15406 1726854970.38872: checking for max_fail_percentage 15406 1726854970.38873: done checking for max_fail_percentage 15406 1726854970.38874: checking to see if all hosts have failed and the running result is not ok 15406 1726854970.38875: done checking to see if all hosts have failed 15406 1726854970.38876: getting the remaining hosts for this loop 15406 1726854970.38882: done getting the remaining hosts for this loop 15406 1726854970.38886: getting the next task for host managed_node2 15406 1726854970.38892: done getting next task for host managed_node2 15406 1726854970.38894: ^ task is: TASK: meta (flush_handlers) 15406 1726854970.38898: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15406 1726854970.38901: getting variables 15406 1726854970.38902: in VariableManager get_vars() 15406 1726854970.38915: Calling all_inventory to load vars for managed_node2 15406 1726854970.38917: Calling groups_inventory to load vars for managed_node2 15406 1726854970.38919: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854970.38931: Calling all_plugins_play to load vars for managed_node2 15406 1726854970.38934: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854970.38938: Calling groups_plugins_play to load vars for managed_node2 15406 1726854970.41320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854970.44132: done with get_vars() 15406 1726854970.44160: done getting variables 15406 1726854970.44224: in VariableManager get_vars() 15406 1726854970.44238: Calling all_inventory to load vars for managed_node2 15406 1726854970.44240: Calling groups_inventory to load vars for managed_node2 15406 1726854970.44243: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854970.44248: Calling all_plugins_play to load vars for managed_node2 15406 1726854970.44250: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854970.44253: Calling groups_plugins_play to load vars for managed_node2 15406 1726854970.45880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854970.48890: done with get_vars() 15406 1726854970.48919: done queuing things up, now waiting for results queue to drain 15406 1726854970.48921: results queue empty 15406 1726854970.48922: checking for any_errors_fatal 15406 1726854970.48924: done checking for any_errors_fatal 15406 1726854970.48924: checking for max_fail_percentage 15406 1726854970.48925: done checking for max_fail_percentage 15406 1726854970.48927: checking to see if all hosts have failed and 
the running result is not ok 15406 1726854970.48927: done checking to see if all hosts have failed 15406 1726854970.48928: getting the remaining hosts for this loop 15406 1726854970.48929: done getting the remaining hosts for this loop 15406 1726854970.48935: getting the next task for host managed_node2 15406 1726854970.48938: done getting next task for host managed_node2 15406 1726854970.48939: ^ task is: TASK: meta (flush_handlers) 15406 1726854970.48940: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854970.48942: getting variables 15406 1726854970.48943: in VariableManager get_vars() 15406 1726854970.48963: Calling all_inventory to load vars for managed_node2 15406 1726854970.48965: Calling groups_inventory to load vars for managed_node2 15406 1726854970.48967: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854970.48972: Calling all_plugins_play to load vars for managed_node2 15406 1726854970.48973: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854970.48975: Calling groups_plugins_play to load vars for managed_node2 15406 1726854970.49833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854970.51269: done with get_vars() 15406 1726854970.51292: done getting variables 15406 1726854970.51349: in VariableManager get_vars() 15406 1726854970.51364: Calling all_inventory to load vars for managed_node2 15406 1726854970.51367: Calling groups_inventory to load vars for managed_node2 15406 1726854970.51369: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854970.51375: Calling all_plugins_play to load vars for managed_node2 15406 1726854970.51377: Calling 
groups_plugins_inventory to load vars for managed_node2 15406 1726854970.51380: Calling groups_plugins_play to load vars for managed_node2 15406 1726854970.52434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854970.53638: done with get_vars() 15406 1726854970.53661: done queuing things up, now waiting for results queue to drain 15406 1726854970.53662: results queue empty 15406 1726854970.53663: checking for any_errors_fatal 15406 1726854970.53664: done checking for any_errors_fatal 15406 1726854970.53664: checking for max_fail_percentage 15406 1726854970.53665: done checking for max_fail_percentage 15406 1726854970.53665: checking to see if all hosts have failed and the running result is not ok 15406 1726854970.53666: done checking to see if all hosts have failed 15406 1726854970.53666: getting the remaining hosts for this loop 15406 1726854970.53667: done getting the remaining hosts for this loop 15406 1726854970.53672: getting the next task for host managed_node2 15406 1726854970.53674: done getting next task for host managed_node2 15406 1726854970.53675: ^ task is: None 15406 1726854970.53676: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854970.53676: done queuing things up, now waiting for results queue to drain 15406 1726854970.53677: results queue empty 15406 1726854970.53677: checking for any_errors_fatal 15406 1726854970.53678: done checking for any_errors_fatal 15406 1726854970.53678: checking for max_fail_percentage 15406 1726854970.53679: done checking for max_fail_percentage 15406 1726854970.53679: checking to see if all hosts have failed and the running result is not ok 15406 1726854970.53679: done checking to see if all hosts have failed 15406 1726854970.53680: getting the next task for host managed_node2 15406 1726854970.53681: done getting next task for host managed_node2 15406 1726854970.53682: ^ task is: None 15406 1726854970.53682: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854970.53724: in VariableManager get_vars() 15406 1726854970.53741: done with get_vars() 15406 1726854970.53745: in VariableManager get_vars() 15406 1726854970.53752: done with get_vars() 15406 1726854970.53758: variable 'omit' from source: magic vars 15406 1726854970.53868: variable 'task' from source: play vars 15406 1726854970.53894: in VariableManager get_vars() 15406 1726854970.53902: done with get_vars() 15406 1726854970.53915: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_absent.yml] ************************ 15406 1726854970.54134: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15406 1726854970.54155: getting the remaining hosts for this loop 15406 1726854970.54156: done getting the remaining hosts for this loop 15406 1726854970.54159: getting the next task for host managed_node2 15406 1726854970.54161: done getting next task for host managed_node2 15406 1726854970.54165: ^ task is: TASK: Gathering Facts 15406 1726854970.54167: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854970.54171: getting variables 15406 1726854970.54172: in VariableManager get_vars() 15406 1726854970.54182: Calling all_inventory to load vars for managed_node2 15406 1726854970.54184: Calling groups_inventory to load vars for managed_node2 15406 1726854970.54188: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854970.54194: Calling all_plugins_play to load vars for managed_node2 15406 1726854970.54199: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854970.54202: Calling groups_plugins_play to load vars for managed_node2 15406 1726854970.55380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854970.56940: done with get_vars() 15406 1726854970.56959: done getting variables 15406 1726854970.57006: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 13:56:10 -0400 (0:00:00.631) 0:00:38.393 ****** 15406 1726854970.57031: entering _queue_task() for managed_node2/gather_facts 15406 1726854970.57374: worker is 1 (out of 1 available) 15406 1726854970.57386: exiting _queue_task() for managed_node2/gather_facts 15406 1726854970.57400: done queuing things up, now waiting for results queue to drain 15406 1726854970.57401: waiting for pending results... 
15406 1726854970.57643: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15406 1726854970.57793: in run() - task 0affcc66-ac2b-3c83-32d3-00000000046e 15406 1726854970.57798: variable 'ansible_search_path' from source: unknown 15406 1726854970.57827: calling self._execute() 15406 1726854970.57993: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854970.57997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854970.58001: variable 'omit' from source: magic vars 15406 1726854970.58395: variable 'ansible_distribution_major_version' from source: facts 15406 1726854970.58412: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854970.58423: variable 'omit' from source: magic vars 15406 1726854970.58456: variable 'omit' from source: magic vars 15406 1726854970.58577: variable 'omit' from source: magic vars 15406 1726854970.58580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854970.58597: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854970.58624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854970.58647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854970.58664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854970.58709: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854970.58718: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854970.58726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854970.58834: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 
1726854970.58848: Set connection var ansible_timeout to 10 15406 1726854970.58856: Set connection var ansible_connection to ssh 15406 1726854970.58867: Set connection var ansible_shell_type to sh 15406 1726854970.58884: Set connection var ansible_shell_executable to /bin/sh 15406 1726854970.58996: Set connection var ansible_pipelining to False 15406 1726854970.59000: variable 'ansible_shell_executable' from source: unknown 15406 1726854970.59002: variable 'ansible_connection' from source: unknown 15406 1726854970.59004: variable 'ansible_module_compression' from source: unknown 15406 1726854970.59008: variable 'ansible_shell_type' from source: unknown 15406 1726854970.59010: variable 'ansible_shell_executable' from source: unknown 15406 1726854970.59014: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854970.59016: variable 'ansible_pipelining' from source: unknown 15406 1726854970.59019: variable 'ansible_timeout' from source: unknown 15406 1726854970.59021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854970.59197: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854970.59214: variable 'omit' from source: magic vars 15406 1726854970.59224: starting attempt loop 15406 1726854970.59233: running the handler 15406 1726854970.59256: variable 'ansible_facts' from source: unknown 15406 1726854970.59351: _low_level_execute_command(): starting 15406 1726854970.59354: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854970.60015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854970.60031: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 15406 1726854970.60045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854970.60069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854970.60086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854970.60176: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854970.60211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854970.60231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854970.60295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854970.60457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854970.62166: stdout chunk (state=3): >>>/root <<< 15406 1726854970.62302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854970.62329: stderr chunk (state=3): >>><<< 15406 1726854970.62338: stdout chunk (state=3): >>><<< 15406 1726854970.62371: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854970.62477: _low_level_execute_command(): starting 15406 1726854970.62499: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854970.6238472-17126-262630732269768 `" && echo ansible-tmp-1726854970.6238472-17126-262630732269768="` echo /root/.ansible/tmp/ansible-tmp-1726854970.6238472-17126-262630732269768 `" ) && sleep 0' 15406 1726854970.63015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854970.63030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854970.63046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854970.63064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854970.63175: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854970.63191: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854970.63218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854970.63315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854970.65236: stdout chunk (state=3): >>>ansible-tmp-1726854970.6238472-17126-262630732269768=/root/.ansible/tmp/ansible-tmp-1726854970.6238472-17126-262630732269768 <<< 15406 1726854970.65374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854970.65396: stdout chunk (state=3): >>><<< 15406 1726854970.65411: stderr chunk (state=3): >>><<< 15406 1726854970.65438: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854970.6238472-17126-262630732269768=/root/.ansible/tmp/ansible-tmp-1726854970.6238472-17126-262630732269768 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854970.65575: variable 'ansible_module_compression' from source: unknown 15406 1726854970.65779: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15406 1726854970.65876: variable 'ansible_facts' from source: unknown 15406 1726854970.66082: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854970.6238472-17126-262630732269768/AnsiballZ_setup.py 15406 1726854970.66648: Sending initial data 15406 1726854970.66654: Sent initial data (154 bytes) 15406 1726854970.67429: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854970.67443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854970.67456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854970.67556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854970.69231: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854970.69531: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854970.69559: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpkp03sh05 /root/.ansible/tmp/ansible-tmp-1726854970.6238472-17126-262630732269768/AnsiballZ_setup.py <<< 15406 1726854970.69727: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854970.6238472-17126-262630732269768/AnsiballZ_setup.py" <<< 15406 1726854970.69731: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpkp03sh05" to remote "/root/.ansible/tmp/ansible-tmp-1726854970.6238472-17126-262630732269768/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854970.6238472-17126-262630732269768/AnsiballZ_setup.py" <<< 15406 1726854970.71459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854970.71515: stderr chunk (state=3): >>><<< 15406 1726854970.71531: stdout chunk (state=3): >>><<< 15406 1726854970.71561: done transferring module to remote 15406 1726854970.71586: _low_level_execute_command(): starting 15406 1726854970.71604: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854970.6238472-17126-262630732269768/ /root/.ansible/tmp/ansible-tmp-1726854970.6238472-17126-262630732269768/AnsiballZ_setup.py && sleep 0' 15406 1726854970.73006: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854970.73116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854970.73504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854970.73635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854970.75466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854970.75478: stdout chunk (state=3): >>><<< 15406 1726854970.75494: stderr chunk (state=3): >>><<< 15406 1726854970.75516: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854970.75523: _low_level_execute_command(): starting 15406 1726854970.75532: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854970.6238472-17126-262630732269768/AnsiballZ_setup.py && sleep 0' 15406 1726854970.76904: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854970.76978: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 15406 1726854970.76991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854970.77050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854970.77061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854970.77106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854970.77180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 15406 1726854971.40280: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, 
"ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "11", "epoch": "1726854971", "epoch_int": "1726854971", "date": "2024-09-20", "time": "13:56:11", "iso8601_micro": "2024-09-20T17:56:11.044751Z", "iso8601": "2024-09-20T17:56:11Z", "iso8601_basic": "20240920T135611044751", "iso8601_basic_short": "20240920T135611", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.302734375, "5m": 0.33642578125, "15m": 0.17578125}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, 
"ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansi<<< 15406 1726854971.40347: stdout chunk (state=3): >>>ble_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, 
"uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 754, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795061760, "block_size": 4096, "block_total": 65519099, "block_available": 63914810, "block_used": 1604289, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": 
"lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": 
"off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": 
"off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], 
"executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15406 1726854971.42183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854971.42215: stderr chunk (state=3): >>><<< 15406 1726854971.42219: stdout chunk (state=3): >>><<< 15406 1726854971.42247: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "11", "epoch": "1726854971", "epoch_int": "1726854971", "date": "2024-09-20", "time": "13:56:11", "iso8601_micro": "2024-09-20T17:56:11.044751Z", "iso8601": "2024-09-20T17:56:11Z", "iso8601_basic": "20240920T135611044751", "iso8601_basic_short": "20240920T135611", "tz": "EDT", 
"tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.302734375, "5m": 0.33642578125, "15m": 0.17578125}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, 
"start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 754, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795061760, "block_size": 4096, "block_total": 65519099, "block_available": 63914810, "block_used": 1604289, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": 
"||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": 
"10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854971.42472: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854970.6238472-17126-262630732269768/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854971.42491: _low_level_execute_command(): starting 15406 1726854971.42594: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854970.6238472-17126-262630732269768/ > /dev/null 2>&1 && sleep 0' 15406 1726854971.43385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854971.43405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854971.43424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854971.43446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854971.43509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854971.43578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854971.43600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854971.43633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854971.43903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854971.45674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854971.45709: stderr chunk (state=3): >>><<< 15406 1726854971.45762: stdout chunk (state=3): >>><<< 15406 1726854971.45766: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854971.45769: handler run complete 15406 1726854971.45916: variable 'ansible_facts' from source: unknown 15406 1726854971.46209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854971.46948: variable 'ansible_facts' from source: unknown 15406 1726854971.46951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854971.47125: attempt loop complete, returning result 15406 1726854971.47134: _execute() done 15406 1726854971.47142: dumping result to json 15406 1726854971.47174: done dumping result, returning 15406 1726854971.47190: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-3c83-32d3-00000000046e] 15406 1726854971.47206: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000046e 15406 1726854971.48101: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000046e 15406 1726854971.48105: WORKER PROCESS EXITING ok: [managed_node2] 15406 1726854971.48426: no more pending results, returning what we have 15406 1726854971.48430: results queue empty 15406 1726854971.48431: checking for any_errors_fatal 15406 1726854971.48432: done checking for any_errors_fatal 15406 1726854971.48434: checking for max_fail_percentage 15406 1726854971.48436: done checking for max_fail_percentage 15406 1726854971.48437: checking to see if all hosts have failed and the running result is not ok 15406 1726854971.48438: done checking to see if all hosts have failed 15406 1726854971.48439: getting the remaining hosts for this loop 15406 1726854971.48440: done getting the remaining hosts for this loop 15406 1726854971.48443: getting the next task for host managed_node2 15406 1726854971.48464: 
done getting next task for host managed_node2 15406 1726854971.48465: ^ task is: TASK: meta (flush_handlers) 15406 1726854971.48467: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854971.48471: getting variables 15406 1726854971.48473: in VariableManager get_vars() 15406 1726854971.48509: Calling all_inventory to load vars for managed_node2 15406 1726854971.48512: Calling groups_inventory to load vars for managed_node2 15406 1726854971.48516: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854971.48526: Calling all_plugins_play to load vars for managed_node2 15406 1726854971.48528: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854971.48531: Calling groups_plugins_play to load vars for managed_node2 15406 1726854971.49940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854971.51615: done with get_vars() 15406 1726854971.51644: done getting variables 15406 1726854971.51726: in VariableManager get_vars() 15406 1726854971.51738: Calling all_inventory to load vars for managed_node2 15406 1726854971.51740: Calling groups_inventory to load vars for managed_node2 15406 1726854971.51743: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854971.51748: Calling all_plugins_play to load vars for managed_node2 15406 1726854971.51751: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854971.51754: Calling groups_plugins_play to load vars for managed_node2 15406 1726854971.53158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854971.54896: done with get_vars() 15406 1726854971.54929: done 
queuing things up, now waiting for results queue to drain 15406 1726854971.54931: results queue empty 15406 1726854971.54932: checking for any_errors_fatal 15406 1726854971.54936: done checking for any_errors_fatal 15406 1726854971.54941: checking for max_fail_percentage 15406 1726854971.54942: done checking for max_fail_percentage 15406 1726854971.54943: checking to see if all hosts have failed and the running result is not ok 15406 1726854971.54943: done checking to see if all hosts have failed 15406 1726854971.54944: getting the remaining hosts for this loop 15406 1726854971.54945: done getting the remaining hosts for this loop 15406 1726854971.54948: getting the next task for host managed_node2 15406 1726854971.54952: done getting next task for host managed_node2 15406 1726854971.54955: ^ task is: TASK: Include the task '{{ task }}' 15406 1726854971.54957: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854971.54959: getting variables 15406 1726854971.54960: in VariableManager get_vars() 15406 1726854971.54970: Calling all_inventory to load vars for managed_node2 15406 1726854971.54973: Calling groups_inventory to load vars for managed_node2 15406 1726854971.54975: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854971.54981: Calling all_plugins_play to load vars for managed_node2 15406 1726854971.54983: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854971.54986: Calling groups_plugins_play to load vars for managed_node2 15406 1726854971.56172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854971.58440: done with get_vars() 15406 1726854971.58474: done getting variables 15406 1726854971.58653: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_absent.yml'] ********************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 13:56:11 -0400 (0:00:01.016) 0:00:39.409 ****** 15406 1726854971.58692: entering _queue_task() for managed_node2/include_tasks 15406 1726854971.59052: worker is 1 (out of 1 available) 15406 1726854971.59064: exiting _queue_task() for managed_node2/include_tasks 15406 1726854971.59075: done queuing things up, now waiting for results queue to drain 15406 1726854971.59077: waiting for pending results... 
15406 1726854971.59525: running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_profile_absent.yml' 15406 1726854971.59530: in run() - task 0affcc66-ac2b-3c83-32d3-000000000073 15406 1726854971.59594: variable 'ansible_search_path' from source: unknown 15406 1726854971.59598: calling self._execute() 15406 1726854971.59675: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854971.59684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854971.59701: variable 'omit' from source: magic vars 15406 1726854971.60645: variable 'ansible_distribution_major_version' from source: facts 15406 1726854971.60648: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854971.60651: variable 'task' from source: play vars 15406 1726854971.60653: variable 'task' from source: play vars 15406 1726854971.60792: _execute() done 15406 1726854971.60795: dumping result to json 15406 1726854971.60798: done dumping result, returning 15406 1726854971.60800: done running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_profile_absent.yml' [0affcc66-ac2b-3c83-32d3-000000000073] 15406 1726854971.60802: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000073 15406 1726854971.61094: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000073 15406 1726854971.61097: WORKER PROCESS EXITING 15406 1726854971.61121: no more pending results, returning what we have 15406 1726854971.61126: in VariableManager get_vars() 15406 1726854971.61156: Calling all_inventory to load vars for managed_node2 15406 1726854971.61158: Calling groups_inventory to load vars for managed_node2 15406 1726854971.61162: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854971.61172: Calling all_plugins_play to load vars for managed_node2 15406 1726854971.61175: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854971.61178: Calling 
groups_plugins_play to load vars for managed_node2 15406 1726854971.62658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854971.64208: done with get_vars() 15406 1726854971.64229: variable 'ansible_search_path' from source: unknown 15406 1726854971.64245: we have included files to process 15406 1726854971.64246: generating all_blocks data 15406 1726854971.64247: done generating all_blocks data 15406 1726854971.64248: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15406 1726854971.64249: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15406 1726854971.64251: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15406 1726854971.64415: in VariableManager get_vars() 15406 1726854971.64431: done with get_vars() 15406 1726854971.64540: done processing included file 15406 1726854971.64542: iterating over new_blocks loaded from include file 15406 1726854971.64543: in VariableManager get_vars() 15406 1726854971.64554: done with get_vars() 15406 1726854971.64555: filtering new block on tags 15406 1726854971.64571: done filtering new block on tags 15406 1726854971.64574: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 15406 1726854971.64578: extending task lists for all hosts with included blocks 15406 1726854971.64608: done extending task lists 15406 1726854971.64610: done processing included files 15406 1726854971.64611: results queue empty 15406 1726854971.64611: checking for any_errors_fatal 15406 1726854971.64613: done checking for any_errors_fatal 15406 
1726854971.64614: checking for max_fail_percentage 15406 1726854971.64615: done checking for max_fail_percentage 15406 1726854971.64616: checking to see if all hosts have failed and the running result is not ok 15406 1726854971.64616: done checking to see if all hosts have failed 15406 1726854971.64617: getting the remaining hosts for this loop 15406 1726854971.64618: done getting the remaining hosts for this loop 15406 1726854971.64620: getting the next task for host managed_node2 15406 1726854971.64624: done getting next task for host managed_node2 15406 1726854971.64626: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15406 1726854971.64628: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854971.64631: getting variables 15406 1726854971.64632: in VariableManager get_vars() 15406 1726854971.64639: Calling all_inventory to load vars for managed_node2 15406 1726854971.64641: Calling groups_inventory to load vars for managed_node2 15406 1726854971.64644: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854971.64649: Calling all_plugins_play to load vars for managed_node2 15406 1726854971.64651: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854971.64654: Calling groups_plugins_play to load vars for managed_node2 15406 1726854971.65795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854971.67272: done with get_vars() 15406 1726854971.67301: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 13:56:11 -0400 (0:00:00.086) 0:00:39.496 ****** 15406 1726854971.67385: entering _queue_task() for managed_node2/include_tasks 15406 1726854971.68133: worker is 1 (out of 1 available) 15406 1726854971.68147: exiting _queue_task() for managed_node2/include_tasks 15406 1726854971.68161: done queuing things up, now waiting for results queue to drain 15406 1726854971.68162: waiting for pending results... 
15406 1726854971.68684: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 15406 1726854971.69059: in run() - task 0affcc66-ac2b-3c83-32d3-00000000047f 15406 1726854971.69064: variable 'ansible_search_path' from source: unknown 15406 1726854971.69067: variable 'ansible_search_path' from source: unknown 15406 1726854971.69073: calling self._execute() 15406 1726854971.69545: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854971.69548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854971.69551: variable 'omit' from source: magic vars 15406 1726854971.70122: variable 'ansible_distribution_major_version' from source: facts 15406 1726854971.70138: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854971.70148: _execute() done 15406 1726854971.70155: dumping result to json 15406 1726854971.70167: done dumping result, returning 15406 1726854971.70178: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0affcc66-ac2b-3c83-32d3-00000000047f] 15406 1726854971.70186: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000047f 15406 1726854971.70325: no more pending results, returning what we have 15406 1726854971.70331: in VariableManager get_vars() 15406 1726854971.70365: Calling all_inventory to load vars for managed_node2 15406 1726854971.70367: Calling groups_inventory to load vars for managed_node2 15406 1726854971.70371: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854971.70383: Calling all_plugins_play to load vars for managed_node2 15406 1726854971.70386: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854971.70391: Calling groups_plugins_play to load vars for managed_node2 15406 1726854971.71004: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000047f 15406 1726854971.71008: WORKER PROCESS EXITING 15406 
1726854971.71951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854971.73762: done with get_vars() 15406 1726854971.73806: variable 'ansible_search_path' from source: unknown 15406 1726854971.73808: variable 'ansible_search_path' from source: unknown 15406 1726854971.73819: variable 'task' from source: play vars 15406 1726854971.73937: variable 'task' from source: play vars 15406 1726854971.73971: we have included files to process 15406 1726854971.73972: generating all_blocks data 15406 1726854971.73974: done generating all_blocks data 15406 1726854971.73975: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15406 1726854971.73976: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15406 1726854971.73979: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15406 1726854971.74935: done processing included file 15406 1726854971.74937: iterating over new_blocks loaded from include file 15406 1726854971.74938: in VariableManager get_vars() 15406 1726854971.74950: done with get_vars() 15406 1726854971.74952: filtering new block on tags 15406 1726854971.74976: done filtering new block on tags 15406 1726854971.74979: in VariableManager get_vars() 15406 1726854971.75015: done with get_vars() 15406 1726854971.75018: filtering new block on tags 15406 1726854971.75042: done filtering new block on tags 15406 1726854971.75062: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 15406 1726854971.75068: extending task lists for all hosts with included blocks 15406 1726854971.75159: done extending 
task lists 15406 1726854971.75160: done processing included files 15406 1726854971.75161: results queue empty 15406 1726854971.75162: checking for any_errors_fatal 15406 1726854971.75165: done checking for any_errors_fatal 15406 1726854971.75166: checking for max_fail_percentage 15406 1726854971.75167: done checking for max_fail_percentage 15406 1726854971.75168: checking to see if all hosts have failed and the running result is not ok 15406 1726854971.75168: done checking to see if all hosts have failed 15406 1726854971.75169: getting the remaining hosts for this loop 15406 1726854971.75170: done getting the remaining hosts for this loop 15406 1726854971.75172: getting the next task for host managed_node2 15406 1726854971.75176: done getting next task for host managed_node2 15406 1726854971.75178: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15406 1726854971.75181: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854971.75183: getting variables 15406 1726854971.75184: in VariableManager get_vars() 15406 1726854971.75198: Calling all_inventory to load vars for managed_node2 15406 1726854971.75201: Calling groups_inventory to load vars for managed_node2 15406 1726854971.75203: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854971.75209: Calling all_plugins_play to load vars for managed_node2 15406 1726854971.75211: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854971.75213: Calling groups_plugins_play to load vars for managed_node2 15406 1726854971.76508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854971.83911: done with get_vars() 15406 1726854971.83936: done getting variables 15406 1726854971.84005: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:56:11 -0400 (0:00:00.166) 0:00:39.663 ****** 15406 1726854971.84028: entering _queue_task() for managed_node2/set_fact 15406 1726854971.84822: worker is 1 (out of 1 available) 15406 1726854971.84832: exiting _queue_task() for managed_node2/set_fact 15406 1726854971.84843: done queuing things up, now waiting for results queue to drain 15406 1726854971.84845: waiting for pending results... 
15406 1726854971.85094: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 15406 1726854971.85300: in run() - task 0affcc66-ac2b-3c83-32d3-00000000048a 15406 1726854971.85324: variable 'ansible_search_path' from source: unknown 15406 1726854971.85332: variable 'ansible_search_path' from source: unknown 15406 1726854971.85380: calling self._execute() 15406 1726854971.85542: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854971.85561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854971.85685: variable 'omit' from source: magic vars 15406 1726854971.86342: variable 'ansible_distribution_major_version' from source: facts 15406 1726854971.86358: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854971.86375: variable 'omit' from source: magic vars 15406 1726854971.86455: variable 'omit' from source: magic vars 15406 1726854971.86503: variable 'omit' from source: magic vars 15406 1726854971.86548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854971.86593: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854971.86624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854971.86655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854971.86678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854971.86720: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854971.86729: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854971.86774: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 15406 1726854971.86851: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854971.86863: Set connection var ansible_timeout to 10 15406 1726854971.86870: Set connection var ansible_connection to ssh 15406 1726854971.86884: Set connection var ansible_shell_type to sh 15406 1726854971.86898: Set connection var ansible_shell_executable to /bin/sh 15406 1726854971.86910: Set connection var ansible_pipelining to False 15406 1726854971.86992: variable 'ansible_shell_executable' from source: unknown 15406 1726854971.86998: variable 'ansible_connection' from source: unknown 15406 1726854971.87001: variable 'ansible_module_compression' from source: unknown 15406 1726854971.87004: variable 'ansible_shell_type' from source: unknown 15406 1726854971.87006: variable 'ansible_shell_executable' from source: unknown 15406 1726854971.87008: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854971.87010: variable 'ansible_pipelining' from source: unknown 15406 1726854971.87013: variable 'ansible_timeout' from source: unknown 15406 1726854971.87015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854971.87148: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854971.87166: variable 'omit' from source: magic vars 15406 1726854971.87211: starting attempt loop 15406 1726854971.87215: running the handler 15406 1726854971.87217: handler run complete 15406 1726854971.87219: attempt loop complete, returning result 15406 1726854971.87222: _execute() done 15406 1726854971.87228: dumping result to json 15406 1726854971.87235: done dumping result, returning 15406 1726854971.87246: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcc66-ac2b-3c83-32d3-00000000048a] 15406 1726854971.87254: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000048a 15406 1726854971.87540: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000048a 15406 1726854971.87543: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15406 1726854971.87602: no more pending results, returning what we have 15406 1726854971.87606: results queue empty 15406 1726854971.87607: checking for any_errors_fatal 15406 1726854971.87610: done checking for any_errors_fatal 15406 1726854971.87610: checking for max_fail_percentage 15406 1726854971.87612: done checking for max_fail_percentage 15406 1726854971.87613: checking to see if all hosts have failed and the running result is not ok 15406 1726854971.87614: done checking to see if all hosts have failed 15406 1726854971.87615: getting the remaining hosts for this loop 15406 1726854971.87616: done getting the remaining hosts for this loop 15406 1726854971.87620: getting the next task for host managed_node2 15406 1726854971.87627: done getting next task for host managed_node2 15406 1726854971.87630: ^ task is: TASK: Stat profile file 15406 1726854971.87634: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854971.87638: getting variables 15406 1726854971.87640: in VariableManager get_vars() 15406 1726854971.87690: Calling all_inventory to load vars for managed_node2 15406 1726854971.87694: Calling groups_inventory to load vars for managed_node2 15406 1726854971.87700: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854971.87713: Calling all_plugins_play to load vars for managed_node2 15406 1726854971.87716: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854971.87719: Calling groups_plugins_play to load vars for managed_node2 15406 1726854971.90665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854971.92778: done with get_vars() 15406 1726854971.92831: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:56:11 -0400 (0:00:00.089) 0:00:39.752 ****** 15406 1726854971.92938: entering _queue_task() for managed_node2/stat 15406 1726854971.93338: worker is 1 (out of 1 available) 15406 1726854971.93351: exiting _queue_task() for managed_node2/stat 15406 1726854971.93362: done queuing things up, now waiting for results queue to drain 15406 1726854971.93363: waiting for pending results... 
15406 1726854971.93761: running TaskExecutor() for managed_node2/TASK: Stat profile file 15406 1726854971.93819: in run() - task 0affcc66-ac2b-3c83-32d3-00000000048b 15406 1726854971.93837: variable 'ansible_search_path' from source: unknown 15406 1726854971.93844: variable 'ansible_search_path' from source: unknown 15406 1726854971.93917: calling self._execute() 15406 1726854971.94100: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854971.94137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854971.94152: variable 'omit' from source: magic vars 15406 1726854971.94708: variable 'ansible_distribution_major_version' from source: facts 15406 1726854971.94893: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854971.94897: variable 'omit' from source: magic vars 15406 1726854971.94899: variable 'omit' from source: magic vars 15406 1726854971.94986: variable 'profile' from source: play vars 15406 1726854971.95000: variable 'interface' from source: set_fact 15406 1726854971.95165: variable 'interface' from source: set_fact 15406 1726854971.95212: variable 'omit' from source: magic vars 15406 1726854971.95317: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854971.95417: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854971.95500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854971.95523: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854971.95673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854971.95679: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 
1726854971.95682: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854971.95685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854971.96110: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854971.96114: Set connection var ansible_timeout to 10 15406 1726854971.96117: Set connection var ansible_connection to ssh 15406 1726854971.96122: Set connection var ansible_shell_type to sh 15406 1726854971.96126: Set connection var ansible_shell_executable to /bin/sh 15406 1726854971.96129: Set connection var ansible_pipelining to False 15406 1726854971.96131: variable 'ansible_shell_executable' from source: unknown 15406 1726854971.96133: variable 'ansible_connection' from source: unknown 15406 1726854971.96135: variable 'ansible_module_compression' from source: unknown 15406 1726854971.96139: variable 'ansible_shell_type' from source: unknown 15406 1726854971.96142: variable 'ansible_shell_executable' from source: unknown 15406 1726854971.96144: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854971.96146: variable 'ansible_pipelining' from source: unknown 15406 1726854971.96148: variable 'ansible_timeout' from source: unknown 15406 1726854971.96150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854971.96512: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854971.96552: variable 'omit' from source: magic vars 15406 1726854971.96565: starting attempt loop 15406 1726854971.96570: running the handler 15406 1726854971.96585: _low_level_execute_command(): starting 15406 1726854971.96597: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854971.98170: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854971.98312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854971.98344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854971.98425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854972.00147: stdout chunk (state=3): >>>/root <<< 15406 1726854972.00282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854972.00309: stdout chunk (state=3): >>><<< 15406 1726854972.00312: stderr chunk (state=3): >>><<< 15406 1726854972.00329: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854972.00392: _low_level_execute_command(): starting 15406 1726854972.00397: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854972.003354-17175-125098288889288 `" && echo ansible-tmp-1726854972.003354-17175-125098288889288="` echo /root/.ansible/tmp/ansible-tmp-1726854972.003354-17175-125098288889288 `" ) && sleep 0' 15406 1726854972.01044: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854972.01063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 15406 1726854972.01151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854972.01176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854972.01192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854972.01301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854972.03207: stdout chunk (state=3): >>>ansible-tmp-1726854972.003354-17175-125098288889288=/root/.ansible/tmp/ansible-tmp-1726854972.003354-17175-125098288889288 <<< 15406 1726854972.03350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854972.03365: stdout chunk (state=3): >>><<< 15406 1726854972.03466: stderr chunk (state=3): >>><<< 15406 1726854972.03470: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854972.003354-17175-125098288889288=/root/.ansible/tmp/ansible-tmp-1726854972.003354-17175-125098288889288 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854972.03473: variable 'ansible_module_compression' from source: unknown 15406 1726854972.03527: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15406 1726854972.03569: variable 'ansible_facts' from source: unknown 15406 1726854972.03679: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854972.003354-17175-125098288889288/AnsiballZ_stat.py 15406 1726854972.03822: Sending initial data 15406 1726854972.03935: Sent initial data (152 bytes) 15406 1726854972.04756: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854972.04881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854972.04902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854972.05147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854972.05215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854972.06788: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854972.06860: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854972.06933: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpil9a9ns1 /root/.ansible/tmp/ansible-tmp-1726854972.003354-17175-125098288889288/AnsiballZ_stat.py <<< 15406 1726854972.06936: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854972.003354-17175-125098288889288/AnsiballZ_stat.py" <<< 15406 1726854972.07029: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpil9a9ns1" to remote "/root/.ansible/tmp/ansible-tmp-1726854972.003354-17175-125098288889288/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854972.003354-17175-125098288889288/AnsiballZ_stat.py" <<< 15406 1726854972.08309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854972.08312: stdout chunk (state=3): >>><<< 15406 1726854972.08315: stderr chunk (state=3): >>><<< 15406 1726854972.08317: done transferring module to remote 15406 1726854972.08319: _low_level_execute_command(): starting 15406 1726854972.08322: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854972.003354-17175-125098288889288/ /root/.ansible/tmp/ansible-tmp-1726854972.003354-17175-125098288889288/AnsiballZ_stat.py && sleep 0' 15406 1726854972.09006: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854972.09021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854972.09065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854972.09086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854972.09112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854972.09176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854972.09214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854972.09235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854972.09276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854972.09399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854972.11125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854972.11174: stderr chunk (state=3): >>><<< 15406 1726854972.11190: stdout chunk (state=3): >>><<< 15406 1726854972.11216: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854972.11299: _low_level_execute_command(): starting 15406 1726854972.11303: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854972.003354-17175-125098288889288/AnsiballZ_stat.py && sleep 0' 15406 1726854972.12290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854972.12309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854972.12325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854972.12344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854972.12482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854972.12634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854972.12712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854972.27901: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15406 1726854972.29203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854972.29207: stdout chunk (state=3): >>><<< 15406 1726854972.29209: stderr chunk (state=3): >>><<< 15406 1726854972.29304: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854972.29312: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854972.003354-17175-125098288889288/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854972.29318: _low_level_execute_command(): starting 15406 1726854972.29320: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854972.003354-17175-125098288889288/ > /dev/null 2>&1 && sleep 0' 15406 1726854972.29938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854972.29943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854972.29975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 
10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854972.29978: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854972.29980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854972.30029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854972.30033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854972.30042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854972.30118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854972.32132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854972.32135: stdout chunk (state=3): >>><<< 15406 1726854972.32137: stderr chunk (state=3): >>><<< 15406 1726854972.32139: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854972.32141: handler run complete 15406 1726854972.32142: attempt loop complete, returning result 15406 1726854972.32144: _execute() done 15406 1726854972.32146: dumping result to json 15406 1726854972.32147: done dumping result, returning 15406 1726854972.32149: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0affcc66-ac2b-3c83-32d3-00000000048b] 15406 1726854972.32150: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000048b 15406 1726854972.32221: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000048b 15406 1726854972.32224: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 15406 1726854972.32303: no more pending results, returning what we have 15406 1726854972.32306: results queue empty 15406 1726854972.32307: checking for any_errors_fatal 15406 1726854972.32312: done checking for any_errors_fatal 15406 1726854972.32313: checking for max_fail_percentage 15406 1726854972.32315: done checking for max_fail_percentage 15406 1726854972.32315: checking to see if all hosts have failed and the running result is not ok 15406 1726854972.32316: done checking to see if all hosts have failed 15406 1726854972.32317: getting the remaining hosts for this loop 15406 1726854972.32318: done getting the remaining hosts for this loop 15406 
1726854972.32322: getting the next task for host managed_node2 15406 1726854972.32327: done getting next task for host managed_node2 15406 1726854972.32329: ^ task is: TASK: Set NM profile exist flag based on the profile files 15406 1726854972.32332: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854972.32336: getting variables 15406 1726854972.32337: in VariableManager get_vars() 15406 1726854972.32474: Calling all_inventory to load vars for managed_node2 15406 1726854972.32477: Calling groups_inventory to load vars for managed_node2 15406 1726854972.32481: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854972.32494: Calling all_plugins_play to load vars for managed_node2 15406 1726854972.32497: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854972.32500: Calling groups_plugins_play to load vars for managed_node2 15406 1726854972.33756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854972.34982: done with get_vars() 15406 1726854972.35000: done getting variables 15406 1726854972.35063: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:56:12 -0400 (0:00:00.421) 0:00:40.173 ****** 15406 1726854972.35108: entering _queue_task() for managed_node2/set_fact 15406 1726854972.35620: worker is 1 (out of 1 available) 15406 1726854972.35641: exiting _queue_task() for managed_node2/set_fact 15406 1726854972.35714: done queuing things up, now waiting for results queue to drain 15406 1726854972.35716: waiting for pending results... 
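The skip recorded below is driven by the conditional `profile_stat.stat.exists` evaluating to False (the ifcfg file was absent on the remote). A hedged sketch of what this set_fact task likely contains, using the fact name `lsr_net_profile_exists` seen in the earlier "Initialize NM profile exist and ansible_managed comment flag" result; the real YAML at `get_profile_stat.yml:17` may differ:

```yaml
# Hedged sketch — fact name taken from the init task's ansible_facts,
# condition taken from the logged "false_condition" field.
- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists
```

With `profile_stat.stat.exists` false, Ansible reports `skipping: [managed_node2]` with `"false_condition": "profile_stat.stat.exists"`, exactly as the log shows.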
15406 1726854972.36005: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 15406 1726854972.36085: in run() - task 0affcc66-ac2b-3c83-32d3-00000000048c 15406 1726854972.36111: variable 'ansible_search_path' from source: unknown 15406 1726854972.36125: variable 'ansible_search_path' from source: unknown 15406 1726854972.36173: calling self._execute() 15406 1726854972.36268: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854972.36272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854972.36285: variable 'omit' from source: magic vars 15406 1726854972.36849: variable 'ansible_distribution_major_version' from source: facts 15406 1726854972.36853: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854972.36892: variable 'profile_stat' from source: set_fact 15406 1726854972.37097: Evaluated conditional (profile_stat.stat.exists): False 15406 1726854972.37103: when evaluation is False, skipping this task 15406 1726854972.37106: _execute() done 15406 1726854972.37108: dumping result to json 15406 1726854972.37114: done dumping result, returning 15406 1726854972.37118: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0affcc66-ac2b-3c83-32d3-00000000048c] 15406 1726854972.37120: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000048c 15406 1726854972.37196: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000048c 15406 1726854972.37200: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15406 1726854972.37293: no more pending results, returning what we have 15406 1726854972.37299: results queue empty 15406 1726854972.37300: checking for any_errors_fatal 15406 1726854972.37308: done checking for any_errors_fatal 15406 1726854972.37309: 
checking for max_fail_percentage 15406 1726854972.37311: done checking for max_fail_percentage 15406 1726854972.37311: checking to see if all hosts have failed and the running result is not ok 15406 1726854972.37312: done checking to see if all hosts have failed 15406 1726854972.37313: getting the remaining hosts for this loop 15406 1726854972.37314: done getting the remaining hosts for this loop 15406 1726854972.37317: getting the next task for host managed_node2 15406 1726854972.37323: done getting next task for host managed_node2 15406 1726854972.37326: ^ task is: TASK: Get NM profile info 15406 1726854972.37329: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854972.37333: getting variables 15406 1726854972.37334: in VariableManager get_vars() 15406 1726854972.37362: Calling all_inventory to load vars for managed_node2 15406 1726854972.37365: Calling groups_inventory to load vars for managed_node2 15406 1726854972.37371: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854972.37385: Calling all_plugins_play to load vars for managed_node2 15406 1726854972.37391: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854972.37399: Calling groups_plugins_play to load vars for managed_node2 15406 1726854972.39432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854972.40580: done with get_vars() 15406 1726854972.40602: done getting variables 15406 1726854972.40643: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:56:12 -0400 (0:00:00.055) 0:00:40.229 ****** 15406 1726854972.40666: entering _queue_task() for managed_node2/shell 15406 1726854972.40912: worker is 1 (out of 1 available) 15406 1726854972.40926: exiting _queue_task() for managed_node2/shell 15406 1726854972.40937: done queuing things up, now waiting for results queue to drain 15406 1726854972.40938: waiting for pending results... 
15406 1726854972.41110: running TaskExecutor() for managed_node2/TASK: Get NM profile info 15406 1726854972.41185: in run() - task 0affcc66-ac2b-3c83-32d3-00000000048d 15406 1726854972.41199: variable 'ansible_search_path' from source: unknown 15406 1726854972.41202: variable 'ansible_search_path' from source: unknown 15406 1726854972.41227: calling self._execute() 15406 1726854972.41300: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854972.41304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854972.41310: variable 'omit' from source: magic vars 15406 1726854972.41692: variable 'ansible_distribution_major_version' from source: facts 15406 1726854972.41699: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854972.41701: variable 'omit' from source: magic vars 15406 1726854972.41821: variable 'omit' from source: magic vars 15406 1726854972.41914: variable 'profile' from source: play vars 15406 1726854972.41923: variable 'interface' from source: set_fact 15406 1726854972.41979: variable 'interface' from source: set_fact 15406 1726854972.42006: variable 'omit' from source: magic vars 15406 1726854972.42050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854972.42100: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854972.42114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854972.42133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854972.42148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854972.42181: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 
1726854972.42191: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854972.42200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854972.42302: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854972.42314: Set connection var ansible_timeout to 10 15406 1726854972.42322: Set connection var ansible_connection to ssh 15406 1726854972.42331: Set connection var ansible_shell_type to sh 15406 1726854972.42338: Set connection var ansible_shell_executable to /bin/sh 15406 1726854972.42349: Set connection var ansible_pipelining to False 15406 1726854972.42374: variable 'ansible_shell_executable' from source: unknown 15406 1726854972.42380: variable 'ansible_connection' from source: unknown 15406 1726854972.42386: variable 'ansible_module_compression' from source: unknown 15406 1726854972.42396: variable 'ansible_shell_type' from source: unknown 15406 1726854972.42403: variable 'ansible_shell_executable' from source: unknown 15406 1726854972.42410: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854972.42592: variable 'ansible_pipelining' from source: unknown 15406 1726854972.42596: variable 'ansible_timeout' from source: unknown 15406 1726854972.42598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854972.42601: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854972.42603: variable 'omit' from source: magic vars 15406 1726854972.42605: starting attempt loop 15406 1726854972.42607: running the handler 15406 1726854972.42610: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854972.42619: _low_level_execute_command(): starting 15406 1726854972.42630: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854972.43713: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854972.43789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854972.43866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854972.43908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854972.44015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854972.44059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854972.44160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854972.45772: stdout chunk (state=3): >>>/root <<< 15406 1726854972.45916: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 15406 1726854972.45927: stdout chunk (state=3): >>><<< 15406 1726854972.45938: stderr chunk (state=3): >>><<< 15406 1726854972.45969: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854972.45993: _low_level_execute_command(): starting 15406 1726854972.46009: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854972.4597716-17202-120113926209390 `" && echo ansible-tmp-1726854972.4597716-17202-120113926209390="` echo /root/.ansible/tmp/ansible-tmp-1726854972.4597716-17202-120113926209390 `" ) && sleep 0' 15406 1726854972.46678: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854972.46713: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config <<< 15406 1726854972.46726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854972.46750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854972.46854: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854972.46907: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854972.47013: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854972.47165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854972.47191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854972.47331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854972.49236: stdout chunk (state=3): >>>ansible-tmp-1726854972.4597716-17202-120113926209390=/root/.ansible/tmp/ansible-tmp-1726854972.4597716-17202-120113926209390 <<< 15406 1726854972.49711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854972.49716: stdout chunk (state=3): >>><<< 15406 1726854972.49718: stderr chunk (state=3): >>><<< 15406 1726854972.49721: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726854972.4597716-17202-120113926209390=/root/.ansible/tmp/ansible-tmp-1726854972.4597716-17202-120113926209390 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854972.49724: variable 'ansible_module_compression' from source: unknown 15406 1726854972.49937: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15406 1726854972.50076: variable 'ansible_facts' from source: unknown 15406 1726854972.50538: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854972.4597716-17202-120113926209390/AnsiballZ_command.py 15406 1726854972.50895: Sending initial data 15406 1726854972.50966: Sent initial data (156 bytes) 15406 1726854972.52207: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854972.52348: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854972.52607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854972.52733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854972.54349: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15406 1726854972.54443: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854972.54498: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15406 1726854972.54564: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp5o0c6yoz /root/.ansible/tmp/ansible-tmp-1726854972.4597716-17202-120113926209390/AnsiballZ_command.py <<< 15406 1726854972.54596: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854972.4597716-17202-120113926209390/AnsiballZ_command.py" <<< 15406 1726854972.54651: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp5o0c6yoz" to remote "/root/.ansible/tmp/ansible-tmp-1726854972.4597716-17202-120113926209390/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854972.4597716-17202-120113926209390/AnsiballZ_command.py" <<< 15406 1726854972.56073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854972.56076: stdout chunk (state=3): >>><<< 15406 1726854972.56078: stderr chunk (state=3): >>><<< 15406 1726854972.56080: done transferring module to remote 15406 1726854972.56082: _low_level_execute_command(): starting 15406 1726854972.56085: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854972.4597716-17202-120113926209390/ /root/.ansible/tmp/ansible-tmp-1726854972.4597716-17202-120113926209390/AnsiballZ_command.py && sleep 0' 15406 1726854972.57057: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854972.57112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854972.57149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854972.57245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854972.57311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854972.57384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854972.59200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854972.59211: stdout chunk (state=3): >>><<< 15406 1726854972.59261: stderr chunk (state=3): >>><<< 15406 1726854972.59359: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854972.59363: _low_level_execute_command(): starting 15406 1726854972.59366: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854972.4597716-17202-120113926209390/AnsiballZ_command.py && sleep 0' 15406 1726854972.60224: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854972.60238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 
1726854972.60337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854972.76991: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 13:56:12.751865", "end": "2024-09-20 13:56:12.767441", "delta": "0:00:00.015576", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15406 1726854972.78323: stderr chunk (state=3): >>>debug2: Received exit status from master 1 <<< 15406 1726854972.78390: stderr chunk (state=3): >>>Shared connection to 10.31.45.178 closed. <<< 15406 1726854972.78432: stdout chunk (state=3): >>><<< 15406 1726854972.78445: stderr chunk (state=3): >>><<< 15406 1726854972.78519: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 13:56:12.751865", "end": "2024-09-20 13:56:12.767441", "delta": "0:00:00.015576", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.45.178 closed. 15406 1726854972.78698: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854972.4597716-17202-120113926209390/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854972.78701: _low_level_execute_command(): starting 15406 1726854972.78704: _low_level_execute_command(): executing: /bin/sh -c 'rm 
-f -r /root/.ansible/tmp/ansible-tmp-1726854972.4597716-17202-120113926209390/ > /dev/null 2>&1 && sleep 0' 15406 1726854972.80436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854972.80450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854972.80465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854972.80692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854972.80811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854972.80917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854972.82801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854972.82804: stdout chunk (state=3): >>><<< 15406 1726854972.82806: stderr chunk (state=3): >>><<< 15406 1726854972.82990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.178 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
15406 1726854972.82994: handler run complete
15406 1726854972.82999: Evaluated conditional (False): False
15406 1726854972.83001: attempt loop complete, returning result
15406 1726854972.83004: _execute() done
15406 1726854972.83006: dumping result to json
15406 1726854972.83008: done dumping result, returning
15406 1726854972.83010: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0affcc66-ac2b-3c83-32d3-00000000048d]
15406 1726854972.83012: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000048d
fatal: [managed_node2]: FAILED! => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc",
    "delta": "0:00:00.015576",
    "end": "2024-09-20 13:56:12.767441",
    "rc": 1,
    "start": "2024-09-20 13:56:12.751865"
}

MSG:

non-zero return code
...ignoring
15406 1726854972.83270: no more pending results, returning what we have
15406 1726854972.83274: results queue empty
15406 1726854972.83275: checking for any_errors_fatal
15406 1726854972.83285: done checking for any_errors_fatal
15406 1726854972.83286: checking for max_fail_percentage
15406 1726854972.83491: done checking for max_fail_percentage
15406 1726854972.83493: checking to see if all hosts have failed and the running result is not ok
15406 1726854972.83497: done checking to see if all hosts have failed
15406 1726854972.83498: getting the remaining hosts for this loop
15406 1726854972.83499: done getting the remaining hosts for this loop
15406 1726854972.83504: getting the next task for host managed_node2
15406 1726854972.83513: done getting next task for host managed_node2
15406 1726854972.83516: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
15406 1726854972.83520: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854972.83526: getting variables
15406 1726854972.83528: in VariableManager get_vars()
15406 1726854972.83561: Calling all_inventory to load vars for managed_node2
15406 1726854972.83563: Calling groups_inventory to load vars for managed_node2
15406 1726854972.83568: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854972.83581: Calling all_plugins_play to load vars for managed_node2
15406 1726854972.83584: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854972.83992: Calling groups_plugins_play to load vars for managed_node2
15406 1726854972.84798: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000048d
15406 1726854972.84802: WORKER PROCESS EXITING
15406 1726854972.87220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854972.91493: done with get_vars()
15406 1726854972.91527: done getting variables
15406 1726854972.91583: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35
Friday 20 September 2024 13:56:12 -0400 (0:00:00.509) 0:00:40.738 ******
15406 1726854972.91618: entering _queue_task() for managed_node2/set_fact
15406 1726854972.92373: worker is 1 (out of 1 available)
15406 1726854972.92389: exiting _queue_task() for managed_node2/set_fact
15406 1726854972.92405: done queuing things up, now waiting for results queue to drain
15406 1726854972.92406: waiting for pending results...
15406 1726854972.93105: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
15406 1726854972.93110: in run() - task 0affcc66-ac2b-3c83-32d3-00000000048e
15406 1726854972.93113: variable 'ansible_search_path' from source: unknown
15406 1726854972.93116: variable 'ansible_search_path' from source: unknown
15406 1726854972.93327: calling self._execute()
15406 1726854972.93421: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854972.93433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854972.93449: variable 'omit' from source: magic vars
15406 1726854972.94170: variable 'ansible_distribution_major_version' from source: facts
15406 1726854972.94306: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854972.94792: variable 'nm_profile_exists' from source: set_fact
15406 1726854972.94796: Evaluated conditional (nm_profile_exists.rc == 0): False
15406 1726854972.94798: when evaluation is False, skipping this task
15406 1726854972.94800: _execute() done
15406 1726854972.94802: dumping result to json
15406 1726854972.94804: done dumping result, returning
15406 1726854972.94807: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcc66-ac2b-3c83-32d3-00000000048e]
15406 1726854972.94808: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000048e
15406 1726854972.94871: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000048e
15406 1726854972.94875: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "nm_profile_exists.rc == 0",
    "skip_reason": "Conditional result was False"
}
15406 1726854972.94928: no more pending results, returning what we have
15406 1726854972.94932: results queue empty
15406 1726854972.94933: checking for any_errors_fatal
15406 1726854972.94942: done checking for any_errors_fatal
15406 1726854972.94943: checking for max_fail_percentage
15406 1726854972.94945: done checking for max_fail_percentage
15406 1726854972.94945: checking to see if all hosts have failed and the running result is not ok
15406 1726854972.94947: done checking to see if all hosts have failed
15406 1726854972.94948: getting the remaining hosts for this loop
15406 1726854972.94949: done getting the remaining hosts for this loop
15406 1726854972.94953: getting the next task for host managed_node2
15406 1726854972.94962: done getting next task for host managed_node2
15406 1726854972.94965: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
15406 1726854972.94970: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854972.94974: getting variables
15406 1726854972.94976: in VariableManager get_vars()
15406 1726854972.95010: Calling all_inventory to load vars for managed_node2
15406 1726854972.95012: Calling groups_inventory to load vars for managed_node2
15406 1726854972.95017: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854972.95030: Calling all_plugins_play to load vars for managed_node2
15406 1726854972.95033: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854972.95035: Calling groups_plugins_play to load vars for managed_node2
15406 1726854972.98344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854973.00820: done with get_vars()
15406 1726854973.00844: done getting variables
15406 1726854973.00910: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15406 1726854973.01034: variable 'profile' from source: play vars
15406 1726854973.01038: variable 'interface' from source: set_fact
15406 1726854973.01100: variable 'interface' from source: set_fact

TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] *******************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Friday 20 September 2024 13:56:13 -0400 (0:00:00.095) 0:00:40.834 ******
15406 1726854973.01132: entering _queue_task() for managed_node2/command
15406 1726854973.01526: worker is 1 (out of 1 available)
15406 1726854973.01539: exiting _queue_task() for managed_node2/command
15406 1726854973.01551: done queuing things up, now waiting for results queue to drain
15406 1726854973.01553: waiting for pending results...
15406 1726854973.01843: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31
15406 1726854973.01986: in run() - task 0affcc66-ac2b-3c83-32d3-000000000490
15406 1726854973.02011: variable 'ansible_search_path' from source: unknown
15406 1726854973.02022: variable 'ansible_search_path' from source: unknown
15406 1726854973.02061: calling self._execute()
15406 1726854973.02158: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854973.02169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854973.02189: variable 'omit' from source: magic vars
15406 1726854973.02573: variable 'ansible_distribution_major_version' from source: facts
15406 1726854973.02593: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854973.02721: variable 'profile_stat' from source: set_fact
15406 1726854973.02741: Evaluated conditional (profile_stat.stat.exists): False
15406 1726854973.02750: when evaluation is False, skipping this task
15406 1726854973.02757: _execute() done
15406 1726854973.02762: dumping result to json
15406 1726854973.02769: done dumping result, returning
15406 1726854973.02788: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [0affcc66-ac2b-3c83-32d3-000000000490]
15406 1726854973.02800: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000490
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15406 1726854973.03078: no more pending results, returning what we have
15406 1726854973.03081: results queue empty
15406 1726854973.03083: checking for any_errors_fatal
15406 1726854973.03099: done checking for any_errors_fatal
15406 1726854973.03101: checking for max_fail_percentage
15406 1726854973.03103: done checking for max_fail_percentage
15406 1726854973.03104: checking to see if all hosts have failed and the running result is not ok
15406 1726854973.03104: done checking to see if all hosts have failed
15406 1726854973.03105: getting the remaining hosts for this loop
15406 1726854973.03106: done getting the remaining hosts for this loop
15406 1726854973.03111: getting the next task for host managed_node2
15406 1726854973.03123: done getting next task for host managed_node2
15406 1726854973.03126: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
15406 1726854973.03131: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854973.03135: getting variables
15406 1726854973.03137: in VariableManager get_vars()
15406 1726854973.03164: Calling all_inventory to load vars for managed_node2
15406 1726854973.03166: Calling groups_inventory to load vars for managed_node2
15406 1726854973.03169: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854973.03183: Calling all_plugins_play to load vars for managed_node2
15406 1726854973.03185: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854973.03206: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000490
15406 1726854973.03209: WORKER PROCESS EXITING
15406 1726854973.03316: Calling groups_plugins_play to load vars for managed_node2
15406 1726854973.04840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854973.06866: done with get_vars()
15406 1726854973.06900: done getting variables
15406 1726854973.06955: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15406 1726854973.07307: variable 'profile' from source: play vars
15406 1726854973.07311: variable 'interface' from source: set_fact
15406 1726854973.07375: variable 'interface' from source: set_fact

TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] ****************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Friday 20 September 2024 13:56:13 -0400 (0:00:00.063) 0:00:40.897 ******
15406 1726854973.07472: entering _queue_task() for managed_node2/set_fact
15406 1726854973.08154: worker is 1 (out of 1 available)
15406 1726854973.08166: exiting _queue_task() for managed_node2/set_fact
15406 1726854973.08177: done queuing things up, now waiting for results queue to drain
15406 1726854973.08178: waiting for pending results...
15406 1726854973.08362: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31
15406 1726854973.08519: in run() - task 0affcc66-ac2b-3c83-32d3-000000000491
15406 1726854973.08538: variable 'ansible_search_path' from source: unknown
15406 1726854973.08546: variable 'ansible_search_path' from source: unknown
15406 1726854973.08585: calling self._execute()
15406 1726854973.08676: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854973.08689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854973.08728: variable 'omit' from source: magic vars
15406 1726854973.09079: variable 'ansible_distribution_major_version' from source: facts
15406 1726854973.09099: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854973.09247: variable 'profile_stat' from source: set_fact
15406 1726854973.09253: Evaluated conditional (profile_stat.stat.exists): False
15406 1726854973.09269: when evaluation is False, skipping this task
15406 1726854973.09357: _execute() done
15406 1726854973.09361: dumping result to json
15406 1726854973.09364: done dumping result, returning
15406 1726854973.09367: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [0affcc66-ac2b-3c83-32d3-000000000491]
15406 1726854973.09370: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000491
15406 1726854973.09446: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000491
15406 1726854973.09450: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15406 1726854973.09515: no more pending results, returning what we have
15406 1726854973.09519: results queue empty
15406 1726854973.09520: checking for any_errors_fatal
15406 1726854973.09528: done checking for any_errors_fatal
15406 1726854973.09529: checking for max_fail_percentage
15406 1726854973.09531: done checking for max_fail_percentage
15406 1726854973.09532: checking to see if all hosts have failed and the running result is not ok
15406 1726854973.09533: done checking to see if all hosts have failed
15406 1726854973.09533: getting the remaining hosts for this loop
15406 1726854973.09535: done getting the remaining hosts for this loop
15406 1726854973.09539: getting the next task for host managed_node2
15406 1726854973.09546: done getting next task for host managed_node2
15406 1726854973.09549: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
15406 1726854973.09553: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854973.09558: getting variables
15406 1726854973.09560: in VariableManager get_vars()
15406 1726854973.09594: Calling all_inventory to load vars for managed_node2
15406 1726854973.09597: Calling groups_inventory to load vars for managed_node2
15406 1726854973.09601: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854973.09615: Calling all_plugins_play to load vars for managed_node2
15406 1726854973.09623: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854973.09627: Calling groups_plugins_play to load vars for managed_node2
15406 1726854973.11768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854973.13916: done with get_vars()
15406 1726854973.13955: done getting variables
15406 1726854973.14016: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15406 1726854973.14134: variable 'profile' from source: play vars
15406 1726854973.14138: variable 'interface' from source: set_fact
15406 1726854973.14203: variable 'interface' from source: set_fact

TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] ***********************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Friday 20 September 2024 13:56:13 -0400 (0:00:00.067) 0:00:40.965 ******
15406 1726854973.14235: entering _queue_task() for managed_node2/command
15406 1726854973.14659: worker is 1 (out of 1 available)
15406 1726854973.14671: exiting _queue_task() for managed_node2/command
15406 1726854973.14791: done queuing things up, now waiting for results queue to drain
15406 1726854973.14801: waiting for pending results...
15406 1726854973.15142: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31
15406 1726854973.15147: in run() - task 0affcc66-ac2b-3c83-32d3-000000000492
15406 1726854973.15150: variable 'ansible_search_path' from source: unknown
15406 1726854973.15153: variable 'ansible_search_path' from source: unknown
15406 1726854973.15170: calling self._execute()
15406 1726854973.15272: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854973.15284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854973.15306: variable 'omit' from source: magic vars
15406 1726854973.15704: variable 'ansible_distribution_major_version' from source: facts
15406 1726854973.15720: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854973.15849: variable 'profile_stat' from source: set_fact
15406 1726854973.15868: Evaluated conditional (profile_stat.stat.exists): False
15406 1726854973.15875: when evaluation is False, skipping this task
15406 1726854973.15882: _execute() done
15406 1726854973.15902: dumping result to json
15406 1726854973.15911: done dumping result, returning
15406 1726854973.15922: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [0affcc66-ac2b-3c83-32d3-000000000492]
15406 1726854973.15932: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000492
15406 1726854973.16064: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000492
15406 1726854973.16067: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15406 1726854973.16158: no more pending results, returning what we have
15406 1726854973.16163: results queue empty
15406 1726854973.16164: checking for any_errors_fatal
15406 1726854973.16173: done checking for any_errors_fatal
15406 1726854973.16174: checking for max_fail_percentage
15406 1726854973.16175: done checking for max_fail_percentage
15406 1726854973.16176: checking to see if all hosts have failed and the running result is not ok
15406 1726854973.16176: done checking to see if all hosts have failed
15406 1726854973.16177: getting the remaining hosts for this loop
15406 1726854973.16179: done getting the remaining hosts for this loop
15406 1726854973.16183: getting the next task for host managed_node2
15406 1726854973.16192: done getting next task for host managed_node2
15406 1726854973.16198: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
15406 1726854973.16202: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854973.16207: getting variables
15406 1726854973.16209: in VariableManager get_vars()
15406 1726854973.16239: Calling all_inventory to load vars for managed_node2
15406 1726854973.16242: Calling groups_inventory to load vars for managed_node2
15406 1726854973.16246: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854973.16260: Calling all_plugins_play to load vars for managed_node2
15406 1726854973.16263: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854973.16267: Calling groups_plugins_play to load vars for managed_node2
15406 1726854973.18016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854973.19752: done with get_vars()
15406 1726854973.19783: done getting variables
15406 1726854973.19851: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15406 1726854973.20038: variable 'profile' from source: play vars
15406 1726854973.20042: variable 'interface' from source: set_fact
15406 1726854973.20111: variable 'interface' from source: set_fact

TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ********************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Friday 20 September 2024 13:56:13 -0400 (0:00:00.059) 0:00:41.024 ******
15406 1726854973.20146: entering _queue_task() for managed_node2/set_fact
15406 1726854973.20692: worker is 1 (out of 1 available)
15406 1726854973.20706: exiting _queue_task() for managed_node2/set_fact
15406 1726854973.20720: done queuing things up, now waiting for results queue to drain
15406 1726854973.20721: waiting for pending results...
15406 1726854973.21114: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31
15406 1726854973.21328: in run() - task 0affcc66-ac2b-3c83-32d3-000000000493
15406 1726854973.21332: variable 'ansible_search_path' from source: unknown
15406 1726854973.21335: variable 'ansible_search_path' from source: unknown
15406 1726854973.21432: calling self._execute()
15406 1726854973.21493: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854973.21509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854973.21532: variable 'omit' from source: magic vars
15406 1726854973.22223: variable 'ansible_distribution_major_version' from source: facts
15406 1726854973.22227: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854973.22229: variable 'profile_stat' from source: set_fact
15406 1726854973.22232: Evaluated conditional (profile_stat.stat.exists): False
15406 1726854973.22233: when evaluation is False, skipping this task
15406 1726854973.22256: _execute() done
15406 1726854973.22271: dumping result to json
15406 1726854973.22280: done dumping result, returning
15406 1726854973.22298: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [0affcc66-ac2b-3c83-32d3-000000000493]
15406 1726854973.22309: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000493
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15406 1726854973.22522: no more pending results, returning what we have
15406 1726854973.22527: results queue empty
15406 1726854973.22528: checking for any_errors_fatal
15406 1726854973.22546: done checking for any_errors_fatal
15406 1726854973.22548: checking for max_fail_percentage
15406 1726854973.22550: done checking for max_fail_percentage
15406 1726854973.22551: checking to see if all hosts have failed and the running result is not ok
15406 1726854973.22564: done checking to see if all hosts have failed
15406 1726854973.22565: getting the remaining hosts for this loop
15406 1726854973.22567: done getting the remaining hosts for this loop
15406 1726854973.22573: getting the next task for host managed_node2
15406 1726854973.22594: done getting next task for host managed_node2
15406 1726854973.22600: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}'
15406 1726854973.22604: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854973.22659: getting variables
15406 1726854973.22662: in VariableManager get_vars()
15406 1726854973.22768: Calling all_inventory to load vars for managed_node2
15406 1726854973.22772: Calling groups_inventory to load vars for managed_node2
15406 1726854973.22789: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854973.22852: Calling all_plugins_play to load vars for managed_node2
15406 1726854973.22860: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854973.22866: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000493
15406 1726854973.22882: WORKER PROCESS EXITING
15406 1726854973.22906: Calling groups_plugins_play to load vars for managed_node2
15406 1726854973.24886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854973.26543: done with get_vars()
15406 1726854973.26593: done getting variables
15406 1726854973.26661: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15406 1726854973.26798: variable 'profile' from source: play vars
15406 1726854973.26802: variable 'interface' from source: set_fact
15406 1726854973.26907: variable 'interface' from source: set_fact

TASK [Assert that the profile is absent - 'LSR-TST-br31'] **********************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5
Friday 20 September 2024 13:56:13 -0400 (0:00:00.067) 0:00:41.092 ******
15406 1726854973.26945: entering _queue_task() for managed_node2/assert
15406 1726854973.27482: worker is 1 (out of 1 available)
15406 1726854973.27523: exiting _queue_task() for managed_node2/assert
15406 1726854973.27540: done queuing things up, now waiting for results queue to drain
15406 1726854973.27542: waiting for pending results...
15406 1726854973.27875: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'LSR-TST-br31'
15406 1726854973.28014: in run() - task 0affcc66-ac2b-3c83-32d3-000000000480
15406 1726854973.28040: variable 'ansible_search_path' from source: unknown
15406 1726854973.28112: variable 'ansible_search_path' from source: unknown
15406 1726854973.28115: calling self._execute()
15406 1726854973.28177: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854973.28194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854973.28222: variable 'omit' from source: magic vars
15406 1726854973.28706: variable 'ansible_distribution_major_version' from source: facts
15406 1726854973.28725: Evaluated conditional (ansible_distribution_major_version != '6'): True
15406 1726854973.28735: variable 'omit' from source: magic vars
15406 1726854973.28779: variable 'omit' from source: magic vars
15406 1726854973.28948: variable 'profile' from source: play vars
15406 1726854973.28957: variable 'interface' from source: set_fact
15406 1726854973.29034: variable 'interface' from source: set_fact
15406 1726854973.29091: variable 'omit' from source: magic vars
15406 1726854973.29124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15406 1726854973.29163: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15406 1726854973.29186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15406 1726854973.29213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15406 1726854973.29305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15406 1726854973.29308: variable 'inventory_hostname' from source: host vars for 'managed_node2'
15406 1726854973.29311: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854973.29313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854973.29383: Set connection var ansible_module_compression to ZIP_DEFLATED
15406 1726854973.29397: Set connection var ansible_timeout to 10
15406 1726854973.29406: Set connection var ansible_connection to ssh
15406 1726854973.29419: Set connection var ansible_shell_type to sh
15406 1726854973.29428: Set connection var ansible_shell_executable to /bin/sh
15406 1726854973.29443: Set connection var ansible_pipelining to False
15406 1726854973.29469: variable 'ansible_shell_executable' from source: unknown
15406 1726854973.29522: variable 'ansible_connection' from source: unknown
15406 1726854973.29524: variable 'ansible_module_compression' from source: unknown
15406 1726854973.29527: variable 'ansible_shell_type' from source: unknown
15406 1726854973.29528: variable 'ansible_shell_executable' from source: unknown
15406 1726854973.29530: variable 'ansible_host' from source: host vars for 'managed_node2'
15406 1726854973.29532: variable 'ansible_pipelining' from source: unknown
15406 1726854973.29534: variable 'ansible_timeout' from source: unknown
15406 1726854973.29536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15406 1726854973.29667: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
15406 1726854973.29683: variable 'omit' from source: magic vars
15406 1726854973.29696: starting attempt loop
15406 1726854973.29703: running the handler
15406 1726854973.29834: variable 'lsr_net_profile_exists' from source: set_fact
15406 1726854973.29881: Evaluated conditional (not lsr_net_profile_exists): True
15406 1726854973.29885: handler run complete
15406 1726854973.29888: attempt loop complete, returning result
15406 1726854973.29891: _execute() done
15406 1726854973.29893: dumping result to json
15406 1726854973.29895: done dumping result, returning
15406 1726854973.29903: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'LSR-TST-br31' [0affcc66-ac2b-3c83-32d3-000000000480]
15406 1726854973.29911: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000480
15406 1726854973.30039: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000480
15406 1726854973.30042: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed
15406 1726854973.30179: no more pending results, returning what we have
15406 1726854973.30183: results queue empty
15406 1726854973.30185: checking for any_errors_fatal
15406 1726854973.30202: done checking for any_errors_fatal
15406 1726854973.30203: checking for max_fail_percentage
15406 1726854973.30206: done checking for max_fail_percentage
15406 1726854973.30207: checking to see if all hosts have failed and the running result is not ok
15406 1726854973.30208: done checking to see if all hosts have failed
15406 1726854973.30208: getting the remaining hosts for this loop
15406 1726854973.30210: done getting the remaining hosts for this loop
15406 1726854973.30213: getting the next task for host managed_node2
15406 1726854973.30228: done getting next task for host managed_node2
15406 1726854973.30231: ^ task is: TASK: meta (flush_handlers)
15406 1726854973.30234: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854973.30239: getting variables
15406 1726854973.30241: in VariableManager get_vars()
15406 1726854973.30270: Calling all_inventory to load vars for managed_node2
15406 1726854973.30274: Calling groups_inventory to load vars for managed_node2
15406 1726854973.30279: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854973.30385: Calling all_plugins_play to load vars for managed_node2
15406 1726854973.30391: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854973.30512: Calling groups_plugins_play to load vars for managed_node2
15406 1726854973.32046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854973.33879: done with get_vars()
15406 1726854973.33908: done getting variables
15406 1726854973.34020: in VariableManager get_vars()
15406 1726854973.34037: Calling all_inventory to load vars for managed_node2
15406 1726854973.34039: Calling groups_inventory to load vars for managed_node2
15406 1726854973.34043: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854973.34048: Calling all_plugins_play to load vars for managed_node2
15406 1726854973.34054: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854973.34057: Calling groups_plugins_play to load vars for managed_node2
15406 1726854973.35511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854973.37158: done with get_vars()
15406 1726854973.37221: done queuing things up, now waiting for results queue to drain
15406 1726854973.37224: results queue empty
15406 1726854973.37225: checking for any_errors_fatal
15406 1726854973.37228: done checking for any_errors_fatal
15406 1726854973.37228: checking for max_fail_percentage
15406 1726854973.37230: done checking for
max_fail_percentage 15406 1726854973.37230: checking to see if all hosts have failed and the running result is not ok 15406 1726854973.37237: done checking to see if all hosts have failed 15406 1726854973.37238: getting the remaining hosts for this loop 15406 1726854973.37239: done getting the remaining hosts for this loop 15406 1726854973.37242: getting the next task for host managed_node2 15406 1726854973.37246: done getting next task for host managed_node2 15406 1726854973.37248: ^ task is: TASK: meta (flush_handlers) 15406 1726854973.37250: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854973.37253: getting variables 15406 1726854973.37254: in VariableManager get_vars() 15406 1726854973.37263: Calling all_inventory to load vars for managed_node2 15406 1726854973.37266: Calling groups_inventory to load vars for managed_node2 15406 1726854973.37268: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854973.37273: Calling all_plugins_play to load vars for managed_node2 15406 1726854973.37276: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854973.37278: Calling groups_plugins_play to load vars for managed_node2 15406 1726854973.38511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854973.40164: done with get_vars() 15406 1726854973.40183: done getting variables 15406 1726854973.40237: in VariableManager get_vars() 15406 1726854973.40251: Calling all_inventory to load vars for managed_node2 15406 1726854973.40254: Calling groups_inventory to load vars for managed_node2 15406 1726854973.40256: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854973.40261: Calling 
all_plugins_play to load vars for managed_node2 15406 1726854973.40263: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854973.40266: Calling groups_plugins_play to load vars for managed_node2 15406 1726854973.41461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854973.43118: done with get_vars() 15406 1726854973.43148: done queuing things up, now waiting for results queue to drain 15406 1726854973.43150: results queue empty 15406 1726854973.43151: checking for any_errors_fatal 15406 1726854973.43152: done checking for any_errors_fatal 15406 1726854973.43153: checking for max_fail_percentage 15406 1726854973.43154: done checking for max_fail_percentage 15406 1726854973.43155: checking to see if all hosts have failed and the running result is not ok 15406 1726854973.43155: done checking to see if all hosts have failed 15406 1726854973.43156: getting the remaining hosts for this loop 15406 1726854973.43157: done getting the remaining hosts for this loop 15406 1726854973.43160: getting the next task for host managed_node2 15406 1726854973.43163: done getting next task for host managed_node2 15406 1726854973.43164: ^ task is: None 15406 1726854973.43165: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854973.43166: done queuing things up, now waiting for results queue to drain 15406 1726854973.43167: results queue empty 15406 1726854973.43168: checking for any_errors_fatal 15406 1726854973.43169: done checking for any_errors_fatal 15406 1726854973.43169: checking for max_fail_percentage 15406 1726854973.43171: done checking for max_fail_percentage 15406 1726854973.43171: checking to see if all hosts have failed and the running result is not ok 15406 1726854973.43172: done checking to see if all hosts have failed 15406 1726854973.43177: getting the next task for host managed_node2 15406 1726854973.43180: done getting next task for host managed_node2 15406 1726854973.43181: ^ task is: None 15406 1726854973.43183: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854973.43226: in VariableManager get_vars() 15406 1726854973.43243: done with get_vars() 15406 1726854973.43249: in VariableManager get_vars() 15406 1726854973.43259: done with get_vars() 15406 1726854973.43264: variable 'omit' from source: magic vars 15406 1726854973.43380: variable 'task' from source: play vars 15406 1726854973.43420: in VariableManager get_vars() 15406 1726854973.43431: done with get_vars() 15406 1726854973.43450: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_absent.yml] ************************* 15406 1726854973.43691: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15406 1726854973.43717: getting the remaining hosts for this loop 15406 1726854973.43718: done getting the remaining hosts for this loop 15406 1726854973.43721: getting the next task for host managed_node2 15406 1726854973.43724: done getting next task for host managed_node2 15406 1726854973.43726: ^ task is: TASK: Gathering Facts 15406 1726854973.43727: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854973.43729: getting variables 15406 1726854973.43735: in VariableManager get_vars() 15406 1726854973.43743: Calling all_inventory to load vars for managed_node2 15406 1726854973.43746: Calling groups_inventory to load vars for managed_node2 15406 1726854973.43748: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854973.43754: Calling all_plugins_play to load vars for managed_node2 15406 1726854973.43756: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854973.43760: Calling groups_plugins_play to load vars for managed_node2 15406 1726854973.45128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854973.46785: done with get_vars() 15406 1726854973.46811: done getting variables 15406 1726854973.46858: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 13:56:13 -0400 (0:00:00.199) 0:00:41.291 ****** 15406 1726854973.46892: entering _queue_task() for managed_node2/gather_facts 15406 1726854973.47249: worker is 1 (out of 1 available) 15406 1726854973.47262: exiting _queue_task() for managed_node2/gather_facts 15406 1726854973.47276: done queuing things up, now waiting for results queue to drain 15406 1726854973.47277: waiting for pending results... 
15406 1726854973.47581: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15406 1726854973.47706: in run() - task 0affcc66-ac2b-3c83-32d3-0000000004c5 15406 1726854973.47711: variable 'ansible_search_path' from source: unknown 15406 1726854973.47815: calling self._execute() 15406 1726854973.47854: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854973.47866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854973.47881: variable 'omit' from source: magic vars 15406 1726854973.48279: variable 'ansible_distribution_major_version' from source: facts 15406 1726854973.48300: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854973.48311: variable 'omit' from source: magic vars 15406 1726854973.48344: variable 'omit' from source: magic vars 15406 1726854973.48391: variable 'omit' from source: magic vars 15406 1726854973.48468: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854973.48481: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854973.48509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854973.48533: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854973.48550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854973.48684: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854973.48690: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854973.48693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854973.48712: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 
1726854973.48723: Set connection var ansible_timeout to 10 15406 1726854973.48731: Set connection var ansible_connection to ssh 15406 1726854973.48742: Set connection var ansible_shell_type to sh 15406 1726854973.48752: Set connection var ansible_shell_executable to /bin/sh 15406 1726854973.48765: Set connection var ansible_pipelining to False 15406 1726854973.48793: variable 'ansible_shell_executable' from source: unknown 15406 1726854973.48805: variable 'ansible_connection' from source: unknown 15406 1726854973.48813: variable 'ansible_module_compression' from source: unknown 15406 1726854973.48820: variable 'ansible_shell_type' from source: unknown 15406 1726854973.48827: variable 'ansible_shell_executable' from source: unknown 15406 1726854973.48833: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854973.48992: variable 'ansible_pipelining' from source: unknown 15406 1726854973.48995: variable 'ansible_timeout' from source: unknown 15406 1726854973.48997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854973.49014: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854973.49029: variable 'omit' from source: magic vars 15406 1726854973.49037: starting attempt loop 15406 1726854973.49042: running the handler 15406 1726854973.49060: variable 'ansible_facts' from source: unknown 15406 1726854973.49079: _low_level_execute_command(): starting 15406 1726854973.49093: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854973.50001: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854973.50014: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854973.50035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854973.50060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854973.50063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854973.50171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854973.51879: stdout chunk (state=3): >>>/root <<< 15406 1726854973.52039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854973.52042: stdout chunk (state=3): >>><<< 15406 1726854973.52045: stderr chunk (state=3): >>><<< 15406 1726854973.52159: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854973.52163: _low_level_execute_command(): starting 15406 1726854973.52166: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854973.520719-17246-196703082594837 `" && echo ansible-tmp-1726854973.520719-17246-196703082594837="` echo /root/.ansible/tmp/ansible-tmp-1726854973.520719-17246-196703082594837 `" ) && sleep 0' 15406 1726854973.52704: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854973.52804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854973.52831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854973.52845: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854973.52874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854973.52980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854973.54968: stdout chunk (state=3): >>>ansible-tmp-1726854973.520719-17246-196703082594837=/root/.ansible/tmp/ansible-tmp-1726854973.520719-17246-196703082594837 <<< 15406 1726854973.55024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854973.55107: stderr chunk (state=3): >>><<< 15406 1726854973.55118: stdout chunk (state=3): >>><<< 15406 1726854973.55294: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854973.520719-17246-196703082594837=/root/.ansible/tmp/ansible-tmp-1726854973.520719-17246-196703082594837 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854973.55301: variable 'ansible_module_compression' from source: unknown 15406 1726854973.55344: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15406 1726854973.55643: variable 'ansible_facts' from source: unknown 15406 1726854973.56015: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854973.520719-17246-196703082594837/AnsiballZ_setup.py 15406 1726854973.56355: Sending initial data 15406 1726854973.56412: Sent initial data (153 bytes) 15406 1726854973.57525: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854973.57529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854973.57533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854973.57535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854973.57582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854973.57613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854973.57660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854973.58003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854973.59362: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 15406 1726854973.59378: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854973.59431: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854973.59532: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpmxau40lr /root/.ansible/tmp/ansible-tmp-1726854973.520719-17246-196703082594837/AnsiballZ_setup.py <<< 15406 1726854973.59536: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854973.520719-17246-196703082594837/AnsiballZ_setup.py" <<< 15406 1726854973.59618: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpmxau40lr" to remote "/root/.ansible/tmp/ansible-tmp-1726854973.520719-17246-196703082594837/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854973.520719-17246-196703082594837/AnsiballZ_setup.py" <<< 15406 1726854973.61298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854973.61302: stdout chunk (state=3): >>><<< 15406 1726854973.61304: stderr chunk (state=3): >>><<< 15406 1726854973.61306: done transferring module to remote 15406 1726854973.61384: _low_level_execute_command(): starting 15406 1726854973.61394: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854973.520719-17246-196703082594837/ /root/.ansible/tmp/ansible-tmp-1726854973.520719-17246-196703082594837/AnsiballZ_setup.py && sleep 0' 15406 1726854973.61961: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854973.61975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854973.62066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854973.62097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854973.62114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854973.62133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854973.62232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854973.64050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854973.64078: stdout chunk (state=3): >>><<< 15406 1726854973.64081: stderr chunk (state=3): >>><<< 15406 1726854973.64101: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 
originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854973.64185: _low_level_execute_command(): starting 15406 1726854973.64192: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854973.520719-17246-196703082594837/AnsiballZ_setup.py && sleep 0' 15406 1726854973.64799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854973.64813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854973.64854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854973.64865: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15406 1726854973.64965: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854973.64989: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15406 1726854973.65099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854974.26324: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_dns": {"search": 
["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "t<<< 15406 1726854974.26343: stdout chunk (state=3): >>>tyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "13", "epoch": "1726854973", "epoch_int": "1726854973", "date": "2024-09-20", "time": "13:56:13", "iso8601_micro": "2024-09-20T17:56:13.922545Z", "iso8601": "2024-09-20T17:56:13Z", "iso8601_basic": "20240920T135613922545", "iso8601_basic_short": "20240920T135613", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.2783203125, "5m": 0.33056640625, "15m": 0.1748046875}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2964, 
"ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 567, "free": 2964}, "nocache": {"free": 3301, "used": 230}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 757, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", 
"options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795033088, "block_size": 4096, "block_total": 65519099, "block_available": 63914803, "block_used": 1604296, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": 
"on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", 
"alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15406 1726854974.28299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854974.28329: stderr chunk (state=3): >>><<< 15406 1726854974.28332: stdout chunk (state=3): >>><<< 15406 1726854974.28364: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": 
"10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "13", "epoch": "1726854973", "epoch_int": "1726854973", "date": "2024-09-20", "time": "13:56:13", "iso8601_micro": "2024-09-20T17:56:13.922545Z", "iso8601": "2024-09-20T17:56:13Z", "iso8601_basic": "20240920T135613922545", "iso8601_basic_short": "20240920T135613", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", 
"ansible_loadavg": {"1m": 0.2783203125, "5m": 0.33056640625, "15m": 0.1748046875}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2964, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 567, "free": 2964}, "nocache": {"free": 3301, "used": 230}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], 
"masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 757, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795033088, "block_size": 4096, "block_total": 65519099, "block_available": 63914803, "block_used": 1604296, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", 
"receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on 
[fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", 
"rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854974.28583: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854973.520719-17246-196703082594837/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854974.28604: _low_level_execute_command(): starting 15406 1726854974.28607: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854973.520719-17246-196703082594837/ > /dev/null 2>&1 && sleep 0' 15406 1726854974.29062: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854974.29066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 <<< 15406 1726854974.29068: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854974.29070: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854974.29072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854974.29126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854974.29129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854974.29208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854974.31013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854974.31037: stderr chunk (state=3): >>><<< 15406 1726854974.31040: stdout chunk (state=3): >>><<< 15406 1726854974.31052: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854974.31061: handler run complete 15406 1726854974.31140: variable 'ansible_facts' from source: unknown 15406 1726854974.31217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854974.31415: variable 'ansible_facts' from source: unknown 15406 1726854974.31465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854974.31543: attempt loop complete, returning result 15406 1726854974.31547: _execute() done 15406 1726854974.31549: dumping result to json 15406 1726854974.31567: done dumping result, returning 15406 1726854974.31575: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-3c83-32d3-0000000004c5] 15406 1726854974.31579: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000004c5 15406 1726854974.31851: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000004c5 15406 1726854974.31854: WORKER PROCESS EXITING ok: [managed_node2] 15406 1726854974.32078: no more pending results, returning what we have 15406 1726854974.32081: results queue empty 15406 1726854974.32082: checking for any_errors_fatal 15406 1726854974.32083: done checking for any_errors_fatal 15406 1726854974.32084: checking for max_fail_percentage 15406 1726854974.32085: done checking for max_fail_percentage 15406 1726854974.32086: checking to see if all hosts have failed and the running result is not ok 15406 1726854974.32092: done checking to see if all hosts have failed 15406 1726854974.32093: getting the remaining hosts for this loop 15406 1726854974.32094: done getting the remaining hosts for this loop 15406 
1726854974.32098: getting the next task for host managed_node2 15406 1726854974.32104: done getting next task for host managed_node2 15406 1726854974.32106: ^ task is: TASK: meta (flush_handlers) 15406 1726854974.32108: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854974.32112: getting variables 15406 1726854974.32114: in VariableManager get_vars() 15406 1726854974.32138: Calling all_inventory to load vars for managed_node2 15406 1726854974.32140: Calling groups_inventory to load vars for managed_node2 15406 1726854974.32142: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854974.32150: Calling all_plugins_play to load vars for managed_node2 15406 1726854974.32152: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854974.32153: Calling groups_plugins_play to load vars for managed_node2 15406 1726854974.33178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854974.35339: done with get_vars() 15406 1726854974.35364: done getting variables 15406 1726854974.35456: in VariableManager get_vars() 15406 1726854974.35464: Calling all_inventory to load vars for managed_node2 15406 1726854974.35466: Calling groups_inventory to load vars for managed_node2 15406 1726854974.35467: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854974.35471: Calling all_plugins_play to load vars for managed_node2 15406 1726854974.35472: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854974.35474: Calling groups_plugins_play to load vars for managed_node2 15406 1726854974.36513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 15406 1726854974.37508: done with get_vars() 15406 1726854974.37526: done queuing things up, now waiting for results queue to drain 15406 1726854974.37527: results queue empty 15406 1726854974.37528: checking for any_errors_fatal 15406 1726854974.37530: done checking for any_errors_fatal 15406 1726854974.37531: checking for max_fail_percentage 15406 1726854974.37535: done checking for max_fail_percentage 15406 1726854974.37536: checking to see if all hosts have failed and the running result is not ok 15406 1726854974.37536: done checking to see if all hosts have failed 15406 1726854974.37537: getting the remaining hosts for this loop 15406 1726854974.37537: done getting the remaining hosts for this loop 15406 1726854974.37539: getting the next task for host managed_node2 15406 1726854974.37542: done getting next task for host managed_node2 15406 1726854974.37543: ^ task is: TASK: Include the task '{{ task }}' 15406 1726854974.37544: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854974.37546: getting variables 15406 1726854974.37546: in VariableManager get_vars() 15406 1726854974.37552: Calling all_inventory to load vars for managed_node2 15406 1726854974.37553: Calling groups_inventory to load vars for managed_node2 15406 1726854974.37555: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854974.37558: Calling all_plugins_play to load vars for managed_node2 15406 1726854974.37559: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854974.37561: Calling groups_plugins_play to load vars for managed_node2 15406 1726854974.38231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854974.39089: done with get_vars() 15406 1726854974.39116: done getting variables 15406 1726854974.39274: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_absent.yml'] *********************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 13:56:14 -0400 (0:00:00.924) 0:00:42.215 ****** 15406 1726854974.39306: entering _queue_task() for managed_node2/include_tasks 15406 1726854974.39645: worker is 1 (out of 1 available) 15406 1726854974.39656: exiting _queue_task() for managed_node2/include_tasks 15406 1726854974.39669: done queuing things up, now waiting for results queue to drain 15406 1726854974.39671: waiting for pending results... 
15406 1726854974.40217: running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_device_absent.yml' 15406 1726854974.40222: in run() - task 0affcc66-ac2b-3c83-32d3-000000000077 15406 1726854974.40226: variable 'ansible_search_path' from source: unknown 15406 1726854974.40228: calling self._execute() 15406 1726854974.40244: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854974.40255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854974.40292: variable 'omit' from source: magic vars 15406 1726854974.40669: variable 'ansible_distribution_major_version' from source: facts 15406 1726854974.40691: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854974.40707: variable 'task' from source: play vars 15406 1726854974.40859: variable 'task' from source: play vars 15406 1726854974.40862: _execute() done 15406 1726854974.40865: dumping result to json 15406 1726854974.40867: done dumping result, returning 15406 1726854974.40870: done running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_device_absent.yml' [0affcc66-ac2b-3c83-32d3-000000000077] 15406 1726854974.40872: sending task result for task 0affcc66-ac2b-3c83-32d3-000000000077 15406 1726854974.40949: done sending task result for task 0affcc66-ac2b-3c83-32d3-000000000077 15406 1726854974.40952: WORKER PROCESS EXITING 15406 1726854974.40991: no more pending results, returning what we have 15406 1726854974.40999: in VariableManager get_vars() 15406 1726854974.41032: Calling all_inventory to load vars for managed_node2 15406 1726854974.41035: Calling groups_inventory to load vars for managed_node2 15406 1726854974.41039: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854974.41054: Calling all_plugins_play to load vars for managed_node2 15406 1726854974.41057: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854974.41060: Calling 
groups_plugins_play to load vars for managed_node2 15406 1726854974.42655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854974.44241: done with get_vars() 15406 1726854974.44265: variable 'ansible_search_path' from source: unknown 15406 1726854974.44280: we have included files to process 15406 1726854974.44281: generating all_blocks data 15406 1726854974.44284: done generating all_blocks data 15406 1726854974.44285: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15406 1726854974.44286: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15406 1726854974.44290: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15406 1726854974.44405: in VariableManager get_vars() 15406 1726854974.44422: done with get_vars() 15406 1726854974.44538: done processing included file 15406 1726854974.44540: iterating over new_blocks loaded from include file 15406 1726854974.44542: in VariableManager get_vars() 15406 1726854974.44553: done with get_vars() 15406 1726854974.44555: filtering new block on tags 15406 1726854974.44571: done filtering new block on tags 15406 1726854974.44573: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 15406 1726854974.44578: extending task lists for all hosts with included blocks 15406 1726854974.44615: done extending task lists 15406 1726854974.44616: done processing included files 15406 1726854974.44617: results queue empty 15406 1726854974.44618: checking for any_errors_fatal 15406 1726854974.44619: done checking for any_errors_fatal 15406 
1726854974.44620: checking for max_fail_percentage 15406 1726854974.44621: done checking for max_fail_percentage 15406 1726854974.44622: checking to see if all hosts have failed and the running result is not ok 15406 1726854974.44622: done checking to see if all hosts have failed 15406 1726854974.44623: getting the remaining hosts for this loop 15406 1726854974.44624: done getting the remaining hosts for this loop 15406 1726854974.44627: getting the next task for host managed_node2 15406 1726854974.44631: done getting next task for host managed_node2 15406 1726854974.44633: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15406 1726854974.44636: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854974.44638: getting variables 15406 1726854974.44639: in VariableManager get_vars() 15406 1726854974.44647: Calling all_inventory to load vars for managed_node2 15406 1726854974.44649: Calling groups_inventory to load vars for managed_node2 15406 1726854974.44652: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854974.44657: Calling all_plugins_play to load vars for managed_node2 15406 1726854974.44659: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854974.44662: Calling groups_plugins_play to load vars for managed_node2 15406 1726854974.49977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854974.51582: done with get_vars() 15406 1726854974.51613: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 13:56:14 -0400 (0:00:00.123) 0:00:42.339 ****** 15406 1726854974.51683: entering _queue_task() for managed_node2/include_tasks 15406 1726854974.52050: worker is 1 (out of 1 available) 15406 1726854974.52063: exiting _queue_task() for managed_node2/include_tasks 15406 1726854974.52075: done queuing things up, now waiting for results queue to drain 15406 1726854974.52076: waiting for pending results... 
15406 1726854974.52420: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 15406 1726854974.52503: in run() - task 0affcc66-ac2b-3c83-32d3-0000000004d6 15406 1726854974.52623: variable 'ansible_search_path' from source: unknown 15406 1726854974.52627: variable 'ansible_search_path' from source: unknown 15406 1726854974.52629: calling self._execute() 15406 1726854974.52692: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854974.52699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854974.52702: variable 'omit' from source: magic vars 15406 1726854974.53167: variable 'ansible_distribution_major_version' from source: facts 15406 1726854974.53173: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854974.53179: _execute() done 15406 1726854974.53181: dumping result to json 15406 1726854974.53184: done dumping result, returning 15406 1726854974.53190: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-3c83-32d3-0000000004d6] 15406 1726854974.53193: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000004d6 15406 1726854974.53432: no more pending results, returning what we have 15406 1726854974.53437: in VariableManager get_vars() 15406 1726854974.53471: Calling all_inventory to load vars for managed_node2 15406 1726854974.53474: Calling groups_inventory to load vars for managed_node2 15406 1726854974.53478: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854974.53497: Calling all_plugins_play to load vars for managed_node2 15406 1726854974.53501: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854974.53505: Calling groups_plugins_play to load vars for managed_node2 15406 1726854974.54103: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000004d6 15406 1726854974.54107: WORKER PROCESS EXITING 15406 
1726854974.55091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854974.56757: done with get_vars() 15406 1726854974.56776: variable 'ansible_search_path' from source: unknown 15406 1726854974.56778: variable 'ansible_search_path' from source: unknown 15406 1726854974.56786: variable 'task' from source: play vars 15406 1726854974.56899: variable 'task' from source: play vars 15406 1726854974.56933: we have included files to process 15406 1726854974.56934: generating all_blocks data 15406 1726854974.56936: done generating all_blocks data 15406 1726854974.56937: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15406 1726854974.56938: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15406 1726854974.56941: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15406 1726854974.57133: done processing included file 15406 1726854974.57135: iterating over new_blocks loaded from include file 15406 1726854974.57137: in VariableManager get_vars() 15406 1726854974.57149: done with get_vars() 15406 1726854974.57151: filtering new block on tags 15406 1726854974.57166: done filtering new block on tags 15406 1726854974.57173: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 15406 1726854974.57178: extending task lists for all hosts with included blocks 15406 1726854974.57284: done extending task lists 15406 1726854974.57285: done processing included files 15406 1726854974.57286: results queue empty 15406 1726854974.57288: checking for any_errors_fatal 15406 1726854974.57292: done checking 
for any_errors_fatal 15406 1726854974.57293: checking for max_fail_percentage 15406 1726854974.57294: done checking for max_fail_percentage 15406 1726854974.57297: checking to see if all hosts have failed and the running result is not ok 15406 1726854974.57298: done checking to see if all hosts have failed 15406 1726854974.57298: getting the remaining hosts for this loop 15406 1726854974.57300: done getting the remaining hosts for this loop 15406 1726854974.57302: getting the next task for host managed_node2 15406 1726854974.57307: done getting next task for host managed_node2 15406 1726854974.57309: ^ task is: TASK: Get stat for interface {{ interface }} 15406 1726854974.57311: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854974.57313: getting variables 15406 1726854974.57314: in VariableManager get_vars() 15406 1726854974.57323: Calling all_inventory to load vars for managed_node2 15406 1726854974.57325: Calling groups_inventory to load vars for managed_node2 15406 1726854974.57327: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854974.57332: Calling all_plugins_play to load vars for managed_node2 15406 1726854974.57335: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854974.57337: Calling groups_plugins_play to load vars for managed_node2 15406 1726854974.58863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854974.60513: done with get_vars() 15406 1726854974.60533: done getting variables 15406 1726854974.60660: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:56:14 -0400 (0:00:00.090) 0:00:42.429 ****** 15406 1726854974.60692: entering _queue_task() for managed_node2/stat 15406 1726854974.61126: worker is 1 (out of 1 available) 15406 1726854974.61140: exiting _queue_task() for managed_node2/stat 15406 1726854974.61152: done queuing things up, now waiting for results queue to drain 15406 1726854974.61154: waiting for pending results... 
15406 1726854974.62106: running TaskExecutor() for managed_node2/TASK: Get stat for interface LSR-TST-br31 15406 1726854974.62111: in run() - task 0affcc66-ac2b-3c83-32d3-0000000004e1 15406 1726854974.62114: variable 'ansible_search_path' from source: unknown 15406 1726854974.62117: variable 'ansible_search_path' from source: unknown 15406 1726854974.62120: calling self._execute() 15406 1726854974.62276: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854974.62497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854974.62502: variable 'omit' from source: magic vars 15406 1726854974.63294: variable 'ansible_distribution_major_version' from source: facts 15406 1726854974.63298: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854974.63300: variable 'omit' from source: magic vars 15406 1726854974.63303: variable 'omit' from source: magic vars 15406 1726854974.63424: variable 'interface' from source: set_fact 15406 1726854974.63448: variable 'omit' from source: magic vars 15406 1726854974.63794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854974.63798: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854974.63800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854974.63803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854974.63806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854974.63809: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854974.63812: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854974.63814: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854974.63942: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854974.64037: Set connection var ansible_timeout to 10 15406 1726854974.64045: Set connection var ansible_connection to ssh 15406 1726854974.64059: Set connection var ansible_shell_type to sh 15406 1726854974.64170: Set connection var ansible_shell_executable to /bin/sh 15406 1726854974.64183: Set connection var ansible_pipelining to False 15406 1726854974.64214: variable 'ansible_shell_executable' from source: unknown 15406 1726854974.64223: variable 'ansible_connection' from source: unknown 15406 1726854974.64231: variable 'ansible_module_compression' from source: unknown 15406 1726854974.64237: variable 'ansible_shell_type' from source: unknown 15406 1726854974.64243: variable 'ansible_shell_executable' from source: unknown 15406 1726854974.64274: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854974.64284: variable 'ansible_pipelining' from source: unknown 15406 1726854974.64296: variable 'ansible_timeout' from source: unknown 15406 1726854974.64306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854974.64593: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15406 1726854974.64615: variable 'omit' from source: magic vars 15406 1726854974.64627: starting attempt loop 15406 1726854974.64635: running the handler 15406 1726854974.64709: _low_level_execute_command(): starting 15406 1726854974.64712: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854974.65381: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854974.65405: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 15406 1726854974.65424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854974.65444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854974.65480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration <<< 15406 1726854974.65579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854974.65605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854974.65821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854974.67536: stdout chunk (state=3): >>>/root <<< 15406 1726854974.67696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854974.67700: stdout chunk (state=3): >>><<< 15406 1726854974.67703: stderr chunk (state=3): >>><<< 15406 1726854974.67826: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854974.67830: _low_level_execute_command(): starting 15406 1726854974.67833: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854974.67727-17304-3762060321679 `" && echo ansible-tmp-1726854974.67727-17304-3762060321679="` echo /root/.ansible/tmp/ansible-tmp-1726854974.67727-17304-3762060321679 `" ) && sleep 0' 15406 1726854974.68373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854974.68391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854974.68409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854974.68434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854974.68505: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854974.68561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854974.68582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854974.68608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854974.68705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854974.70624: stdout chunk (state=3): >>>ansible-tmp-1726854974.67727-17304-3762060321679=/root/.ansible/tmp/ansible-tmp-1726854974.67727-17304-3762060321679 <<< 15406 1726854974.70781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854974.70785: stdout chunk (state=3): >>><<< 15406 1726854974.70789: stderr chunk (state=3): >>><<< 15406 1726854974.70808: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854974.67727-17304-3762060321679=/root/.ansible/tmp/ansible-tmp-1726854974.67727-17304-3762060321679 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854974.70891: variable 'ansible_module_compression' from source: unknown 15406 1726854974.70931: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15406 1726854974.70975: variable 'ansible_facts' from source: unknown 15406 1726854974.71152: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854974.67727-17304-3762060321679/AnsiballZ_stat.py 15406 1726854974.71295: Sending initial data 15406 1726854974.71298: Sent initial data (149 bytes) 15406 1726854974.71848: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854974.71862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854974.71877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854974.71928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854974.72002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854974.72021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854974.72047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854974.72324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854974.73867: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854974.74002: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854974.74084: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp2ocjhz69 /root/.ansible/tmp/ansible-tmp-1726854974.67727-17304-3762060321679/AnsiballZ_stat.py <<< 15406 1726854974.74090: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854974.67727-17304-3762060321679/AnsiballZ_stat.py" <<< 15406 1726854974.74133: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmp2ocjhz69" to remote "/root/.ansible/tmp/ansible-tmp-1726854974.67727-17304-3762060321679/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854974.67727-17304-3762060321679/AnsiballZ_stat.py" <<< 15406 1726854974.75996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854974.76000: stdout chunk (state=3): >>><<< 15406 1726854974.76002: stderr chunk (state=3): >>><<< 15406 1726854974.76004: done transferring module to remote 15406 1726854974.76007: _low_level_execute_command(): starting 15406 1726854974.76009: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854974.67727-17304-3762060321679/ /root/.ansible/tmp/ansible-tmp-1726854974.67727-17304-3762060321679/AnsiballZ_stat.py && sleep 0' 15406 1726854974.77135: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854974.77138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854974.77140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854974.77143: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854974.77145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854974.77273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854974.77352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854974.77380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854974.77506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854974.79419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854974.79423: stdout chunk (state=3): >>><<< 15406 1726854974.79425: stderr chunk (state=3): >>><<< 15406 1726854974.79602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854974.79606: _low_level_execute_command(): starting 15406 1726854974.79608: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854974.67727-17304-3762060321679/AnsiballZ_stat.py && sleep 0' 15406 1726854974.80803: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854974.80806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854974.81020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854974.81130: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854974.96070: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15406 1726854974.97425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. <<< 15406 1726854974.97431: stdout chunk (state=3): >>><<< 15406 1726854974.97434: stderr chunk (state=3): >>><<< 15406 1726854974.97662: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854974.97666: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854974.67727-17304-3762060321679/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854974.97669: _low_level_execute_command(): starting 15406 1726854974.97671: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854974.67727-17304-3762060321679/ > /dev/null 2>&1 && sleep 0' 15406 1726854974.98590: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854974.98613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854974.98653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854974.98752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854974.98907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854974.98943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854974.99048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854975.00901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854975.00916: stdout chunk (state=3): >>><<< 15406 1726854975.00928: stderr chunk (state=3): >>><<< 15406 1726854975.00951: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854975.00971: handler run complete 15406 1726854975.00999: attempt loop complete, returning result 15406 1726854975.01007: _execute() done 15406 1726854975.01015: dumping result to json 15406 1726854975.01045: done dumping result, returning 15406 1726854975.01048: done running TaskExecutor() for managed_node2/TASK: Get stat for interface LSR-TST-br31 [0affcc66-ac2b-3c83-32d3-0000000004e1] 15406 1726854975.01050: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000004e1 ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 15406 1726854975.01250: no more pending results, returning what we have 15406 1726854975.01254: results queue empty 15406 1726854975.01255: checking for any_errors_fatal 15406 1726854975.01257: done checking for any_errors_fatal 15406 1726854975.01257: checking for max_fail_percentage 15406 1726854975.01259: done checking for max_fail_percentage 15406 1726854975.01260: checking to see if all hosts have failed and the running result is not ok 15406 1726854975.01261: done checking to see if all hosts have failed 15406 1726854975.01262: getting the remaining hosts for this loop 15406 1726854975.01263: done getting the remaining hosts for this loop 15406 1726854975.01267: getting the next task for host managed_node2 15406 1726854975.01275: done getting next task for host managed_node2 15406 1726854975.01278: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 15406 1726854975.01281: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854975.01286: getting variables 15406 1726854975.01289: in VariableManager get_vars() 15406 1726854975.01320: Calling all_inventory to load vars for managed_node2 15406 1726854975.01323: Calling groups_inventory to load vars for managed_node2 15406 1726854975.01326: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854975.01338: Calling all_plugins_play to load vars for managed_node2 15406 1726854975.01341: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854975.01344: Calling groups_plugins_play to load vars for managed_node2 15406 1726854975.02103: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000004e1 15406 1726854975.02106: WORKER PROCESS EXITING 15406 1726854975.03478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854975.05650: done with get_vars() 15406 1726854975.05683: done getting variables 15406 1726854975.05866: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 15406 1726854975.06216: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 13:56:15 -0400 
(0:00:00.455) 0:00:42.885 ****** 15406 1726854975.06247: entering _queue_task() for managed_node2/assert 15406 1726854975.06851: worker is 1 (out of 1 available) 15406 1726854975.06866: exiting _queue_task() for managed_node2/assert 15406 1726854975.06880: done queuing things up, now waiting for results queue to drain 15406 1726854975.06882: waiting for pending results... 15406 1726854975.07302: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' 15406 1726854975.07634: in run() - task 0affcc66-ac2b-3c83-32d3-0000000004d7 15406 1726854975.07821: variable 'ansible_search_path' from source: unknown 15406 1726854975.07836: variable 'ansible_search_path' from source: unknown 15406 1726854975.07989: calling self._execute() 15406 1726854975.08312: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854975.08360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854975.08412: variable 'omit' from source: magic vars 15406 1726854975.09233: variable 'ansible_distribution_major_version' from source: facts 15406 1726854975.09273: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854975.09285: variable 'omit' from source: magic vars 15406 1726854975.09415: variable 'omit' from source: magic vars 15406 1726854975.09589: variable 'interface' from source: set_fact 15406 1726854975.09640: variable 'omit' from source: magic vars 15406 1726854975.09774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854975.09794: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854975.09819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854975.09841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
15406 1726854975.09908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854975.09946: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854975.09955: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854975.09963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854975.10110: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854975.10113: Set connection var ansible_timeout to 10 15406 1726854975.10116: Set connection var ansible_connection to ssh 15406 1726854975.10118: Set connection var ansible_shell_type to sh 15406 1726854975.10121: Set connection var ansible_shell_executable to /bin/sh 15406 1726854975.10123: Set connection var ansible_pipelining to False 15406 1726854975.10150: variable 'ansible_shell_executable' from source: unknown 15406 1726854975.10160: variable 'ansible_connection' from source: unknown 15406 1726854975.10167: variable 'ansible_module_compression' from source: unknown 15406 1726854975.10233: variable 'ansible_shell_type' from source: unknown 15406 1726854975.10236: variable 'ansible_shell_executable' from source: unknown 15406 1726854975.10238: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854975.10242: variable 'ansible_pipelining' from source: unknown 15406 1726854975.10244: variable 'ansible_timeout' from source: unknown 15406 1726854975.10246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854975.10702: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854975.10711: variable 'omit' from 
source: magic vars 15406 1726854975.10714: starting attempt loop 15406 1726854975.10717: running the handler 15406 1726854975.10852: variable 'interface_stat' from source: set_fact 15406 1726854975.10876: Evaluated conditional (not interface_stat.stat.exists): True 15406 1726854975.10894: handler run complete 15406 1726854975.10918: attempt loop complete, returning result 15406 1726854975.10926: _execute() done 15406 1726854975.10932: dumping result to json 15406 1726854975.10939: done dumping result, returning 15406 1726854975.10956: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' [0affcc66-ac2b-3c83-32d3-0000000004d7] 15406 1726854975.10966: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000004d7 15406 1726854975.11323: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000004d7 15406 1726854975.11326: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 15406 1726854975.11369: no more pending results, returning what we have 15406 1726854975.11372: results queue empty 15406 1726854975.11373: checking for any_errors_fatal 15406 1726854975.11381: done checking for any_errors_fatal 15406 1726854975.11382: checking for max_fail_percentage 15406 1726854975.11383: done checking for max_fail_percentage 15406 1726854975.11384: checking to see if all hosts have failed and the running result is not ok 15406 1726854975.11385: done checking to see if all hosts have failed 15406 1726854975.11386: getting the remaining hosts for this loop 15406 1726854975.11389: done getting the remaining hosts for this loop 15406 1726854975.11392: getting the next task for host managed_node2 15406 1726854975.11404: done getting next task for host managed_node2 15406 1726854975.11406: ^ task is: TASK: meta (flush_handlers) 15406 1726854975.11408: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854975.11412: getting variables 15406 1726854975.11413: in VariableManager get_vars() 15406 1726854975.11444: Calling all_inventory to load vars for managed_node2 15406 1726854975.11446: Calling groups_inventory to load vars for managed_node2 15406 1726854975.11450: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854975.11461: Calling all_plugins_play to load vars for managed_node2 15406 1726854975.11465: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854975.11467: Calling groups_plugins_play to load vars for managed_node2 15406 1726854975.12999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854975.16233: done with get_vars() 15406 1726854975.16258: done getting variables 15406 1726854975.16327: in VariableManager get_vars() 15406 1726854975.16337: Calling all_inventory to load vars for managed_node2 15406 1726854975.16340: Calling groups_inventory to load vars for managed_node2 15406 1726854975.16342: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854975.16347: Calling all_plugins_play to load vars for managed_node2 15406 1726854975.16349: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854975.16352: Calling groups_plugins_play to load vars for managed_node2 15406 1726854975.17940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854975.19900: done with get_vars() 15406 1726854975.19932: done queuing things up, now waiting for results queue to drain 15406 1726854975.19934: results queue empty 15406 1726854975.19935: checking for any_errors_fatal 15406 1726854975.19937: done checking for any_errors_fatal 15406 1726854975.19938: checking for 
max_fail_percentage 15406 1726854975.19940: done checking for max_fail_percentage 15406 1726854975.19940: checking to see if all hosts have failed and the running result is not ok 15406 1726854975.19941: done checking to see if all hosts have failed 15406 1726854975.19948: getting the remaining hosts for this loop 15406 1726854975.19949: done getting the remaining hosts for this loop 15406 1726854975.19951: getting the next task for host managed_node2 15406 1726854975.19956: done getting next task for host managed_node2 15406 1726854975.19957: ^ task is: TASK: meta (flush_handlers) 15406 1726854975.19959: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854975.19961: getting variables 15406 1726854975.19962: in VariableManager get_vars() 15406 1726854975.19971: Calling all_inventory to load vars for managed_node2 15406 1726854975.19973: Calling groups_inventory to load vars for managed_node2 15406 1726854975.19975: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854975.19981: Calling all_plugins_play to load vars for managed_node2 15406 1726854975.19983: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854975.19986: Calling groups_plugins_play to load vars for managed_node2 15406 1726854975.21984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854975.24653: done with get_vars() 15406 1726854975.24675: done getting variables 15406 1726854975.24728: in VariableManager get_vars() 15406 1726854975.24737: Calling all_inventory to load vars for managed_node2 15406 1726854975.24739: Calling groups_inventory to load vars for managed_node2 15406 1726854975.24741: Calling all_plugins_inventory to load vars 
for managed_node2 15406 1726854975.24746: Calling all_plugins_play to load vars for managed_node2 15406 1726854975.24748: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854975.24751: Calling groups_plugins_play to load vars for managed_node2 15406 1726854975.26414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854975.28686: done with get_vars() 15406 1726854975.28722: done queuing things up, now waiting for results queue to drain 15406 1726854975.28725: results queue empty 15406 1726854975.28725: checking for any_errors_fatal 15406 1726854975.28727: done checking for any_errors_fatal 15406 1726854975.28728: checking for max_fail_percentage 15406 1726854975.28729: done checking for max_fail_percentage 15406 1726854975.28729: checking to see if all hosts have failed and the running result is not ok 15406 1726854975.28730: done checking to see if all hosts have failed 15406 1726854975.28731: getting the remaining hosts for this loop 15406 1726854975.28732: done getting the remaining hosts for this loop 15406 1726854975.28735: getting the next task for host managed_node2 15406 1726854975.28738: done getting next task for host managed_node2 15406 1726854975.28739: ^ task is: None 15406 1726854975.28741: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854975.28742: done queuing things up, now waiting for results queue to drain 15406 1726854975.28743: results queue empty 15406 1726854975.28744: checking for any_errors_fatal 15406 1726854975.28744: done checking for any_errors_fatal 15406 1726854975.28745: checking for max_fail_percentage 15406 1726854975.28746: done checking for max_fail_percentage 15406 1726854975.28746: checking to see if all hosts have failed and the running result is not ok 15406 1726854975.28747: done checking to see if all hosts have failed 15406 1726854975.28748: getting the next task for host managed_node2 15406 1726854975.28750: done getting next task for host managed_node2 15406 1726854975.28751: ^ task is: None 15406 1726854975.28752: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854975.28815: in VariableManager get_vars() 15406 1726854975.28833: done with get_vars() 15406 1726854975.28839: in VariableManager get_vars() 15406 1726854975.28872: done with get_vars() 15406 1726854975.28877: variable 'omit' from source: magic vars 15406 1726854975.28912: in VariableManager get_vars() 15406 1726854975.28921: done with get_vars() 15406 1726854975.28944: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 15406 1726854975.29303: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15406 1726854975.29347: getting the remaining hosts for this loop 15406 1726854975.29348: done getting the remaining hosts for this loop 15406 1726854975.29351: getting the next task for host managed_node2 15406 1726854975.29354: done getting next task for host managed_node2 15406 1726854975.29356: ^ task is: TASK: Gathering Facts 15406 1726854975.29357: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854975.29359: getting variables 15406 1726854975.29360: in VariableManager get_vars() 15406 1726854975.29369: Calling all_inventory to load vars for managed_node2 15406 1726854975.29371: Calling groups_inventory to load vars for managed_node2 15406 1726854975.29373: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854975.29379: Calling all_plugins_play to load vars for managed_node2 15406 1726854975.29381: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854975.29384: Calling groups_plugins_play to load vars for managed_node2 15406 1726854975.30644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854975.32748: done with get_vars() 15406 1726854975.32767: done getting variables 15406 1726854975.32813: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64 Friday 20 September 2024 13:56:15 -0400 (0:00:00.265) 0:00:43.151 ****** 15406 1726854975.32839: entering _queue_task() for managed_node2/gather_facts 15406 1726854975.33628: worker is 1 (out of 1 available) 15406 1726854975.33642: exiting _queue_task() for managed_node2/gather_facts 15406 1726854975.33652: done queuing things up, now waiting for results queue to drain 15406 1726854975.33657: waiting for pending results... 
15406 1726854975.33791: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15406 1726854975.33993: in run() - task 0affcc66-ac2b-3c83-32d3-0000000004fa 15406 1726854975.34046: variable 'ansible_search_path' from source: unknown 15406 1726854975.34111: calling self._execute() 15406 1726854975.34180: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854975.34195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854975.34390: variable 'omit' from source: magic vars 15406 1726854975.34853: variable 'ansible_distribution_major_version' from source: facts 15406 1726854975.34871: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854975.34909: variable 'omit' from source: magic vars 15406 1726854975.34948: variable 'omit' from source: magic vars 15406 1726854975.35013: variable 'omit' from source: magic vars 15406 1726854975.35131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854975.35178: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854975.35208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854975.35229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854975.35276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854975.35318: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854975.35327: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854975.35335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854975.35447: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 
1726854975.35460: Set connection var ansible_timeout to 10 15406 1726854975.35467: Set connection var ansible_connection to ssh 15406 1726854975.35484: Set connection var ansible_shell_type to sh 15406 1726854975.35593: Set connection var ansible_shell_executable to /bin/sh 15406 1726854975.35600: Set connection var ansible_pipelining to False 15406 1726854975.35602: variable 'ansible_shell_executable' from source: unknown 15406 1726854975.35604: variable 'ansible_connection' from source: unknown 15406 1726854975.35606: variable 'ansible_module_compression' from source: unknown 15406 1726854975.35609: variable 'ansible_shell_type' from source: unknown 15406 1726854975.35611: variable 'ansible_shell_executable' from source: unknown 15406 1726854975.35613: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854975.35616: variable 'ansible_pipelining' from source: unknown 15406 1726854975.35618: variable 'ansible_timeout' from source: unknown 15406 1726854975.35620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854975.35844: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854975.35860: variable 'omit' from source: magic vars 15406 1726854975.35869: starting attempt loop 15406 1726854975.35877: running the handler 15406 1726854975.35903: variable 'ansible_facts' from source: unknown 15406 1726854975.35933: _low_level_execute_command(): starting 15406 1726854975.35946: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854975.36811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854975.36854: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854975.36874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854975.36914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854975.37026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854975.38740: stdout chunk (state=3): >>>/root <<< 15406 1726854975.38897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854975.38901: stdout chunk (state=3): >>><<< 15406 1726854975.38904: stderr chunk (state=3): >>><<< 15406 1726854975.38925: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854975.39027: _low_level_execute_command(): starting 15406 1726854975.39030: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854975.3893497-17330-56162333410178 `" && echo ansible-tmp-1726854975.3893497-17330-56162333410178="` echo /root/.ansible/tmp/ansible-tmp-1726854975.3893497-17330-56162333410178 `" ) && sleep 0' 15406 1726854975.39603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854975.39623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854975.39643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854975.39663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854975.39751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854975.39792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854975.39819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854975.39854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854975.39950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854975.41903: stdout chunk (state=3): >>>ansible-tmp-1726854975.3893497-17330-56162333410178=/root/.ansible/tmp/ansible-tmp-1726854975.3893497-17330-56162333410178 <<< 15406 1726854975.42008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854975.42034: stderr chunk (state=3): >>><<< 15406 1726854975.42037: stdout chunk (state=3): >>><<< 15406 1726854975.42053: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854975.3893497-17330-56162333410178=/root/.ansible/tmp/ansible-tmp-1726854975.3893497-17330-56162333410178 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854975.42077: variable 'ansible_module_compression' from source: unknown 15406 1726854975.42125: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15406 1726854975.42176: variable 'ansible_facts' from source: unknown 15406 1726854975.42392: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854975.3893497-17330-56162333410178/AnsiballZ_setup.py 15406 1726854975.42576: Sending initial data 15406 1726854975.42579: Sent initial data (153 bytes) 15406 1726854975.43061: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854975.43108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854975.43171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854975.43195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854975.43275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854975.44840: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 15406 1726854975.44844: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854975.44907: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854975.44979: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpsj4mp1_q /root/.ansible/tmp/ansible-tmp-1726854975.3893497-17330-56162333410178/AnsiballZ_setup.py <<< 15406 1726854975.44982: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854975.3893497-17330-56162333410178/AnsiballZ_setup.py" <<< 15406 1726854975.45053: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpsj4mp1_q" to remote "/root/.ansible/tmp/ansible-tmp-1726854975.3893497-17330-56162333410178/AnsiballZ_setup.py" <<< 15406 1726854975.45057: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854975.3893497-17330-56162333410178/AnsiballZ_setup.py" <<< 15406 1726854975.46803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854975.46807: stdout chunk (state=3): >>><<< 15406 1726854975.46809: stderr chunk (state=3): >>><<< 15406 1726854975.46811: done transferring module to remote 15406 1726854975.46813: _low_level_execute_command(): starting 15406 1726854975.46816: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854975.3893497-17330-56162333410178/ /root/.ansible/tmp/ansible-tmp-1726854975.3893497-17330-56162333410178/AnsiballZ_setup.py && sleep 0' 15406 1726854975.47389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854975.47408: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854975.47431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854975.47449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854975.47553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854975.47565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854975.47580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854975.47604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854975.47705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854975.49547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854975.49565: stdout chunk (state=3): >>><<< 15406 1726854975.49576: stderr chunk (state=3): >>><<< 15406 1726854975.49603: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854975.49612: _low_level_execute_command(): starting 15406 1726854975.49693: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854975.3893497-17330-56162333410178/AnsiballZ_setup.py && sleep 0' 15406 1726854975.50262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854975.50277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854975.50359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854975.50398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854975.50415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854975.50434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854975.50550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854976.12952: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.2783203125, "5m": 0.33056640625, "15m": 0.1748046875}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": 
"/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, 
"crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": 
"off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": 
"off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentati<<< 15406 1726854976.12969: stdout chunk (state=3): >>>on": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, 
"type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2964, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 567, "free": 2964}, "nocache": {"free": 3301, "used": 230}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", 
"uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 759, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795033088, "block_size": 4096, "block_total": 65519099, "block_available": 63914803, "block_used": 1604296, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "16", "epoch": "1726854976", "epoch_int": "1726854976", "date": "2024-09-20", "time": "13:56:16", "iso8601_micro": "2024-09-20T17:56:16.125176Z", "iso8601": "2024-09-20T17:56:16Z", "iso8601_basic": "20240920T135616125176", "iso8601_basic_short": "20240920T135616", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15406 1726854976.14975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854976.15011: stderr chunk (state=3): >>>Shared connection to 10.31.45.178 closed. <<< 15406 1726854976.15015: stdout chunk (state=3): >>><<< 15406 1726854976.15017: stderr chunk (state=3): >>><<< 15406 1726854976.15195: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-178", "ansible_nodename": "ip-10-31-45-178.us-east-1.aws.redhat.com", "ansible_domain": 
"us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22ce5a950c0e1daed0e34b12956afd", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.2783203125, "5m": 0.33056640625, "15m": 0.1748046875}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDYHoDdxLMjhb3WXz7Kq3V961IHxG5ZVnpxhJyUuRxKvuRuE1ZM7WgmeCFZiB2yUGkjU5KZWqFIyeov31ImCb6tvCTgXSA9OPfIRgT4jzdz5In55fdJk7XhiXWLqKZ94cJO7xrcX1+yO6y3Hlyh9LgMHqf0CAPkn5q16dPoFGGIbFVoQ1LdJKNCDA8hE1PP+6CCJHRE9Kbj69yKuOBCyw3zo/yjJDjpLcgrl3vuDYHsBrv91QDMAVNNvCjE4J3EXtBBOu6PUeSrtTs03pc7B0oLtWysVnP+p0F7/BqDejMqif9kPiH4IGmSi2f0Jk+WH4Ah72zQTFgNHD8Yn5Iqv4tvKz+h29mEVstZcCH6Uda3gkSMA69twRNXUiX42/mqGqvrRg9ABRLvvIlT0CpzCgoMA6wvahdbpDh7KwyUGQIgRMHPA0+lFJ4V8cHdI5nxB11tYT+kvRmx4YA384P3r+jDYyvZGAowpdh1r1axM00gb5wfH3pvP70nLhKmgbMp5hk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpkmzWjA0oDcG6Twm1W1xtghk2EK23zQ0AwXc3wxW8e9GSbjP1DWoY0gkaEC8fhSY62JvXuoDJTdWUki0p72Uk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIDd5W6SFs3raD6kVJbYQo0u5/b2r4LrItLQI56hof2x+", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", 
"tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d8:4cff:fefa:7a71", "prefix": 
"64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", 
"rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.178", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:d8:4c:fa:7a:71", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.178"], "ansible_all_ipv6_addresses": ["fe80::d8:4cff:fefa:7a71"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.178", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d8:4cff:fefa:7a71"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2964, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 567, "free": 2964}, "nocache": {"free": 3301, "used": 230}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": 
"ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_uuid": "ec22ce5a-950c-0e1d-aed0-e34b12956afd", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 759, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795033088, "block_size": 4096, "block_total": 65519099, "block_available": 63914803, "block_used": 1604296, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "56", "second": "16", "epoch": "1726854976", "epoch_int": "1726854976", "date": "2024-09-20", "time": "13:56:16", "iso8601_micro": 
"2024-09-20T17:56:16.125176Z", "iso8601": "2024-09-20T17:56:16Z", "iso8601_basic": "20240920T135616125176", "iso8601_basic_short": "20240920T135616", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 51492 10.31.45.178 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 51492 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854976.15550: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854975.3893497-17330-56162333410178/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854976.15577: _low_level_execute_command(): starting 15406 1726854976.15590: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854975.3893497-17330-56162333410178/ > /dev/null 2>&1 && sleep 0' 15406 1726854976.16175: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854976.16201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854976.16216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854976.16305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.16334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854976.16349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854976.16373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854976.16516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854976.18381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854976.18444: stderr chunk (state=3): >>><<< 15406 1726854976.18455: stdout chunk (state=3): >>><<< 15406 1726854976.18490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854976.18508: handler run complete 15406 1726854976.18699: variable 'ansible_facts' from source: unknown 15406 1726854976.18732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854976.19072: variable 'ansible_facts' from source: unknown 15406 1726854976.19174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854976.19315: attempt loop complete, returning result 15406 1726854976.19325: _execute() done 15406 1726854976.19333: dumping result to json 15406 1726854976.19373: done dumping result, returning 15406 1726854976.19386: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcc66-ac2b-3c83-32d3-0000000004fa] 15406 1726854976.19399: sending task result for task 0affcc66-ac2b-3c83-32d3-0000000004fa ok: [managed_node2] 15406 1726854976.20500: no more pending results, returning what we have 15406 1726854976.20504: results queue empty 15406 1726854976.20505: checking for any_errors_fatal 15406 1726854976.20507: done checking for any_errors_fatal 15406 1726854976.20507: checking for max_fail_percentage 15406 1726854976.20509: done checking for max_fail_percentage 15406 1726854976.20510: checking to see if all hosts have failed and the running result is not ok 15406 1726854976.20510: done checking to see if all hosts have failed 15406 1726854976.20511: getting the remaining hosts for this loop 15406 
1726854976.20512: done getting the remaining hosts for this loop 15406 1726854976.20515: getting the next task for host managed_node2 15406 1726854976.20520: done getting next task for host managed_node2 15406 1726854976.20522: ^ task is: TASK: meta (flush_handlers) 15406 1726854976.20523: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15406 1726854976.20527: getting variables 15406 1726854976.20528: in VariableManager get_vars() 15406 1726854976.20549: Calling all_inventory to load vars for managed_node2 15406 1726854976.20552: Calling groups_inventory to load vars for managed_node2 15406 1726854976.20556: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854976.20561: done sending task result for task 0affcc66-ac2b-3c83-32d3-0000000004fa 15406 1726854976.20564: WORKER PROCESS EXITING 15406 1726854976.20574: Calling all_plugins_play to load vars for managed_node2 15406 1726854976.20577: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854976.20580: Calling groups_plugins_play to load vars for managed_node2 15406 1726854976.21884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854976.23484: done with get_vars() 15406 1726854976.23512: done getting variables 15406 1726854976.23584: in VariableManager get_vars() 15406 1726854976.23597: Calling all_inventory to load vars for managed_node2 15406 1726854976.23599: Calling groups_inventory to load vars for managed_node2 15406 1726854976.23602: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854976.23607: Calling all_plugins_play to load vars for managed_node2 15406 1726854976.23610: Calling groups_plugins_inventory to load vars for 
managed_node2 15406 1726854976.23613: Calling groups_plugins_play to load vars for managed_node2 15406 1726854976.24873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854976.26451: done with get_vars() 15406 1726854976.26484: done queuing things up, now waiting for results queue to drain 15406 1726854976.26486: results queue empty 15406 1726854976.26491: checking for any_errors_fatal 15406 1726854976.26495: done checking for any_errors_fatal 15406 1726854976.26496: checking for max_fail_percentage 15406 1726854976.26502: done checking for max_fail_percentage 15406 1726854976.26503: checking to see if all hosts have failed and the running result is not ok 15406 1726854976.26504: done checking to see if all hosts have failed 15406 1726854976.26504: getting the remaining hosts for this loop 15406 1726854976.26505: done getting the remaining hosts for this loop 15406 1726854976.26508: getting the next task for host managed_node2 15406 1726854976.26513: done getting next task for host managed_node2 15406 1726854976.26516: ^ task is: TASK: Verify network state restored to default 15406 1726854976.26518: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854976.26520: getting variables 15406 1726854976.26521: in VariableManager get_vars() 15406 1726854976.26529: Calling all_inventory to load vars for managed_node2 15406 1726854976.26532: Calling groups_inventory to load vars for managed_node2 15406 1726854976.26534: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854976.26539: Calling all_plugins_play to load vars for managed_node2 15406 1726854976.26542: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854976.26545: Calling groups_plugins_play to load vars for managed_node2 15406 1726854976.27721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854976.29276: done with get_vars() 15406 1726854976.29304: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:67 Friday 20 September 2024 13:56:16 -0400 (0:00:00.965) 0:00:44.116 ****** 15406 1726854976.29380: entering _queue_task() for managed_node2/include_tasks 15406 1726854976.29818: worker is 1 (out of 1 available) 15406 1726854976.29830: exiting _queue_task() for managed_node2/include_tasks 15406 1726854976.29841: done queuing things up, now waiting for results queue to drain 15406 1726854976.29842: waiting for pending results... 
15406 1726854976.30051: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 15406 1726854976.30155: in run() - task 0affcc66-ac2b-3c83-32d3-00000000007a 15406 1726854976.30180: variable 'ansible_search_path' from source: unknown 15406 1726854976.30220: calling self._execute() 15406 1726854976.30329: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854976.30341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854976.30356: variable 'omit' from source: magic vars 15406 1726854976.30760: variable 'ansible_distribution_major_version' from source: facts 15406 1726854976.30777: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854976.30789: _execute() done 15406 1726854976.30796: dumping result to json 15406 1726854976.30803: done dumping result, returning 15406 1726854976.30811: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [0affcc66-ac2b-3c83-32d3-00000000007a] 15406 1726854976.30995: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000007a 15406 1726854976.31069: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000007a 15406 1726854976.31073: WORKER PROCESS EXITING 15406 1726854976.31105: no more pending results, returning what we have 15406 1726854976.31111: in VariableManager get_vars() 15406 1726854976.31145: Calling all_inventory to load vars for managed_node2 15406 1726854976.31147: Calling groups_inventory to load vars for managed_node2 15406 1726854976.31151: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854976.31165: Calling all_plugins_play to load vars for managed_node2 15406 1726854976.31168: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854976.31171: Calling groups_plugins_play to load vars for managed_node2 15406 1726854976.32710: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854976.34359: done with get_vars() 15406 1726854976.34378: variable 'ansible_search_path' from source: unknown 15406 1726854976.34393: we have included files to process 15406 1726854976.34397: generating all_blocks data 15406 1726854976.34398: done generating all_blocks data 15406 1726854976.34399: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15406 1726854976.34400: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15406 1726854976.34403: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15406 1726854976.34817: done processing included file 15406 1726854976.34819: iterating over new_blocks loaded from include file 15406 1726854976.34821: in VariableManager get_vars() 15406 1726854976.34831: done with get_vars() 15406 1726854976.34832: filtering new block on tags 15406 1726854976.34846: done filtering new block on tags 15406 1726854976.34848: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 15406 1726854976.34852: extending task lists for all hosts with included blocks 15406 1726854976.34876: done extending task lists 15406 1726854976.34877: done processing included files 15406 1726854976.34877: results queue empty 15406 1726854976.34878: checking for any_errors_fatal 15406 1726854976.34879: done checking for any_errors_fatal 15406 1726854976.34880: checking for max_fail_percentage 15406 1726854976.34881: done checking for max_fail_percentage 15406 1726854976.34881: checking to see if all hosts have failed and the running 
result is not ok 15406 1726854976.34882: done checking to see if all hosts have failed 15406 1726854976.34883: getting the remaining hosts for this loop 15406 1726854976.34884: done getting the remaining hosts for this loop 15406 1726854976.34886: getting the next task for host managed_node2 15406 1726854976.34896: done getting next task for host managed_node2 15406 1726854976.34899: ^ task is: TASK: Check routes and DNS 15406 1726854976.34901: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854976.34903: getting variables 15406 1726854976.34904: in VariableManager get_vars() 15406 1726854976.34911: Calling all_inventory to load vars for managed_node2 15406 1726854976.34913: Calling groups_inventory to load vars for managed_node2 15406 1726854976.34915: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854976.34919: Calling all_plugins_play to load vars for managed_node2 15406 1726854976.34921: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854976.34923: Calling groups_plugins_play to load vars for managed_node2 15406 1726854976.37013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854976.39677: done with get_vars() 15406 1726854976.39702: done getting variables 15406 1726854976.39747: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 13:56:16 -0400 (0:00:00.103) 0:00:44.220 ****** 15406 1726854976.39780: entering _queue_task() for managed_node2/shell 15406 1726854976.40116: worker is 1 (out of 1 available) 15406 1726854976.40128: exiting _queue_task() for managed_node2/shell 15406 1726854976.40139: done queuing things up, now waiting for results queue to drain 15406 1726854976.40141: waiting for pending results... 
15406 1726854976.40518: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 15406 1726854976.40547: in run() - task 0affcc66-ac2b-3c83-32d3-00000000050b 15406 1726854976.40563: variable 'ansible_search_path' from source: unknown 15406 1726854976.40571: variable 'ansible_search_path' from source: unknown 15406 1726854976.40615: calling self._execute() 15406 1726854976.40732: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854976.40743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854976.40759: variable 'omit' from source: magic vars 15406 1726854976.41168: variable 'ansible_distribution_major_version' from source: facts 15406 1726854976.41196: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854976.41210: variable 'omit' from source: magic vars 15406 1726854976.41288: variable 'omit' from source: magic vars 15406 1726854976.41291: variable 'omit' from source: magic vars 15406 1726854976.41327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854976.41362: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854976.41384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854976.41410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854976.41427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854976.41503: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854976.41506: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854976.41508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854976.41592: 
Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854976.41608: Set connection var ansible_timeout to 10 15406 1726854976.41692: Set connection var ansible_connection to ssh 15406 1726854976.41696: Set connection var ansible_shell_type to sh 15406 1726854976.41699: Set connection var ansible_shell_executable to /bin/sh 15406 1726854976.41702: Set connection var ansible_pipelining to False 15406 1726854976.41704: variable 'ansible_shell_executable' from source: unknown 15406 1726854976.41706: variable 'ansible_connection' from source: unknown 15406 1726854976.41709: variable 'ansible_module_compression' from source: unknown 15406 1726854976.41711: variable 'ansible_shell_type' from source: unknown 15406 1726854976.41713: variable 'ansible_shell_executable' from source: unknown 15406 1726854976.41714: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854976.41716: variable 'ansible_pipelining' from source: unknown 15406 1726854976.41720: variable 'ansible_timeout' from source: unknown 15406 1726854976.41722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854976.41871: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854976.41892: variable 'omit' from source: magic vars 15406 1726854976.41903: starting attempt loop 15406 1726854976.41910: running the handler 15406 1726854976.41946: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854976.41951: 
_low_level_execute_command(): starting 15406 1726854976.41963: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854976.42829: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.42863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854976.42880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854976.43007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854976.43164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854976.44869: stdout chunk (state=3): >>>/root <<< 15406 1726854976.45005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854976.45020: stdout chunk (state=3): >>><<< 15406 1726854976.45032: stderr chunk (state=3): >>><<< 15406 1726854976.45056: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854976.45075: _low_level_execute_command(): starting 15406 1726854976.45086: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854976.4506261-17375-54049413387664 `" && echo ansible-tmp-1726854976.4506261-17375-54049413387664="` echo /root/.ansible/tmp/ansible-tmp-1726854976.4506261-17375-54049413387664 `" ) && sleep 0' 15406 1726854976.45689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854976.45715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854976.45728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854976.45758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854976.45861: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.45897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854976.45915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854976.45934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854976.46092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854976.48012: stdout chunk (state=3): >>>ansible-tmp-1726854976.4506261-17375-54049413387664=/root/.ansible/tmp/ansible-tmp-1726854976.4506261-17375-54049413387664 <<< 15406 1726854976.48254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854976.48258: stdout chunk (state=3): >>><<< 15406 1726854976.48260: stderr chunk (state=3): >>><<< 15406 1726854976.48475: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854976.4506261-17375-54049413387664=/root/.ansible/tmp/ansible-tmp-1726854976.4506261-17375-54049413387664 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854976.48479: variable 'ansible_module_compression' from source: unknown 15406 1726854976.48693: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15406 1726854976.48697: variable 'ansible_facts' from source: unknown 15406 1726854976.48822: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854976.4506261-17375-54049413387664/AnsiballZ_command.py 15406 1726854976.49229: Sending initial data 15406 1726854976.49233: Sent initial data (155 bytes) 15406 1726854976.50250: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854976.50375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854976.50390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.50407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.50460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854976.50575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854976.50898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854976.52318: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854976.52410: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15406 1726854976.52508: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpz1odzg7e /root/.ansible/tmp/ansible-tmp-1726854976.4506261-17375-54049413387664/AnsiballZ_command.py <<< 15406 1726854976.52512: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854976.4506261-17375-54049413387664/AnsiballZ_command.py" <<< 15406 1726854976.52659: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpz1odzg7e" to remote "/root/.ansible/tmp/ansible-tmp-1726854976.4506261-17375-54049413387664/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854976.4506261-17375-54049413387664/AnsiballZ_command.py" <<< 15406 1726854976.54327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854976.54492: stderr chunk (state=3): >>><<< 15406 1726854976.54498: stdout chunk (state=3): >>><<< 15406 1726854976.54500: done transferring module to remote 15406 1726854976.54503: _low_level_execute_command(): starting 15406 1726854976.54507: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854976.4506261-17375-54049413387664/ /root/.ansible/tmp/ansible-tmp-1726854976.4506261-17375-54049413387664/AnsiballZ_command.py && sleep 0' 15406 1726854976.55705: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.55823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854976.55836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854976.55921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854976.57756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854976.57792: stderr chunk (state=3): >>><<< 15406 1726854976.57803: stdout chunk (state=3): >>><<< 15406 1726854976.57857: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854976.57867: _low_level_execute_command(): starting 15406 1726854976.57876: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854976.4506261-17375-54049413387664/AnsiballZ_command.py && sleep 0' 15406 1726854976.59365: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.59580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854976.59624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854976.75633: stdout chunk (state=3): >>> <<< 15406 1726854976.75681: stdout chunk (state=3): >>>{"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 
1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:d8:4c:fa:7a:71 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.178/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3376sec preferred_lft 3376sec\n inet6 fe80::d8:4cff:fefa:7a71/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.178 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.178 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:56:16.745513", "end": "2024-09-20 13:56:16.754205", "delta": "0:00:00.008692", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15406 1726854976.77124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 15406 1726854976.77148: stderr chunk (state=3): >>><<< 15406 1726854976.77151: stdout chunk (state=3): >>><<< 15406 1726854976.77168: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:d8:4c:fa:7a:71 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.178/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3376sec preferred_lft 3376sec\n inet6 fe80::d8:4cff:fefa:7a71/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.178 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.178 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:56:16.745513", "end": "2024-09-20 13:56:16.754205", "delta": "0:00:00.008692", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": 
null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
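The module result that comes back on `stdout` in the chunks above is a single JSON object, which Ansible parses to decide the task outcome. A minimal sketch of reading the fields it reports, with the `stdout` value abridged here (the full text is in the surrounding chunks):

```python
import json
from datetime import datetime

# Key fields of the module result exactly as they appear in the log above
# (stdout abridged for brevity).
raw = json.dumps({
    "changed": True,
    "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue ...",
    "stderr": "",
    "rc": 0,
    "start": "2024-09-20 13:56:16.745513",
    "end": "2024-09-20 13:56:16.754205",
    "delta": "0:00:00.008692",
})

result = json.loads(raw)

# rc == 0 with an empty stderr is what marks the command run as successful.
assert result["rc"] == 0 and result["stderr"] == ""

# "delta" is simply end - start, rendered as H:MM:SS.ffffff.
fmt = "%Y-%m-%d %H:%M:%S.%f"
span = datetime.strptime(result["end"], fmt) - datetime.strptime(result["start"], fmt)
print(span)  # 0:00:00.008692
```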
15406 1726854976.77209: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854976.4506261-17375-54049413387664/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854976.77216: _low_level_execute_command(): starting 15406 1726854976.77221: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854976.4506261-17375-54049413387664/ > /dev/null 2>&1 && sleep 0' 15406 1726854976.77626: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854976.77632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.77634: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854976.77636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.77684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854976.77690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854976.77763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854976.79692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854976.79699: stdout chunk (state=3): >>><<< 15406 1726854976.79701: stderr chunk (state=3): >>><<< 15406 1726854976.79704: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854976.79706: handler run complete 15406 1726854976.79708: Evaluated conditional (False): False 15406 1726854976.79710: attempt loop complete, returning result 15406 1726854976.79712: _execute() done 15406 1726854976.79714: dumping result to json 15406 1726854976.79716: done dumping result, returning 15406 1726854976.79717: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [0affcc66-ac2b-3c83-32d3-00000000050b] 15406 1726854976.79719: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000050b 15406 1726854976.79874: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000050b 15406 1726854976.79877: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008692", "end": "2024-09-20 13:56:16.754205", "rc": 0, "start": "2024-09-20 13:56:16.745513" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:d8:4c:fa:7a:71 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.45.178/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 3376sec preferred_lft 3376sec inet6 fe80::d8:4cff:fefa:7a71/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 
proto dhcp src 10.31.45.178 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.178 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 15406 1726854976.79959: no more pending results, returning what we have 15406 1726854976.79964: results queue empty 15406 1726854976.79965: checking for any_errors_fatal 15406 1726854976.79966: done checking for any_errors_fatal 15406 1726854976.79967: checking for max_fail_percentage 15406 1726854976.79968: done checking for max_fail_percentage 15406 1726854976.79969: checking to see if all hosts have failed and the running result is not ok 15406 1726854976.79970: done checking to see if all hosts have failed 15406 1726854976.79971: getting the remaining hosts for this loop 15406 1726854976.79972: done getting the remaining hosts for this loop 15406 1726854976.79975: getting the next task for host managed_node2 15406 1726854976.79981: done getting next task for host managed_node2 15406 1726854976.79983: ^ task is: TASK: Verify DNS and network connectivity 15406 1726854976.79986: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15406 1726854976.79991: getting variables 15406 1726854976.79992: in VariableManager get_vars() 15406 1726854976.80020: Calling all_inventory to load vars for managed_node2 15406 1726854976.80022: Calling groups_inventory to load vars for managed_node2 15406 1726854976.80028: Calling all_plugins_inventory to load vars for managed_node2 15406 1726854976.80038: Calling all_plugins_play to load vars for managed_node2 15406 1726854976.80040: Calling groups_plugins_inventory to load vars for managed_node2 15406 1726854976.80042: Calling groups_plugins_play to load vars for managed_node2 15406 1726854976.80849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15406 1726854976.81855: done with get_vars() 15406 1726854976.81879: done getting variables 15406 1726854976.81947: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 13:56:16 -0400 (0:00:00.421) 0:00:44.642 ****** 15406 1726854976.81976: entering _queue_task() for managed_node2/shell 15406 1726854976.82285: worker is 1 (out of 1 available) 15406 1726854976.82303: exiting _queue_task() for managed_node2/shell 15406 1726854976.82315: done queuing things up, now waiting for results queue to drain 15406 1726854976.82316: waiting for pending results... 
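Each TASK banner in this trace ends with a timing suffix, e.g. `Friday 20 September 2024 13:56:16 -0400 (0:00:00.421) 0:00:44.642`. The parenthesized value reads as the time spent on the preceding task and the second as cumulative elapsed playbook time; that is an interpretation of the log, not something it states. A minimal sketch of pulling both durations out:

```python
import re
from datetime import timedelta

# Timing suffix copied from the TASK banner in the log above.
line = "Friday 20 September 2024 13:56:16 -0400 (0:00:00.421) 0:00:44.642"

def parse_duration(s: str) -> timedelta:
    # "0:00:44.642" -> hours, minutes, fractional seconds.
    h, m, sec = s.split(":")
    return timedelta(hours=int(h), minutes=int(m), seconds=float(sec))

m = re.search(r"\((\d+:\d+:\d+\.\d+)\)\s+(\d+:\d+:\d+\.\d+)", line)
task_time, elapsed = (parse_duration(g) for g in m.groups())
print(task_time.total_seconds(), elapsed.total_seconds())
```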
15406 1726854976.82590: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 15406 1726854976.82666: in run() - task 0affcc66-ac2b-3c83-32d3-00000000050c 15406 1726854976.82683: variable 'ansible_search_path' from source: unknown 15406 1726854976.82688: variable 'ansible_search_path' from source: unknown 15406 1726854976.82718: calling self._execute() 15406 1726854976.82803: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854976.82807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854976.82816: variable 'omit' from source: magic vars 15406 1726854976.83089: variable 'ansible_distribution_major_version' from source: facts 15406 1726854976.83102: Evaluated conditional (ansible_distribution_major_version != '6'): True 15406 1726854976.83196: variable 'ansible_facts' from source: unknown 15406 1726854976.83663: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 15406 1726854976.83667: variable 'omit' from source: magic vars 15406 1726854976.83697: variable 'omit' from source: magic vars 15406 1726854976.83722: variable 'omit' from source: magic vars 15406 1726854976.83864: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15406 1726854976.83868: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15406 1726854976.83870: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15406 1726854976.83873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854976.83875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15406 1726854976.83877: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15406 1726854976.83879: variable 
'ansible_host' from source: host vars for 'managed_node2' 15406 1726854976.83882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854976.83922: Set connection var ansible_module_compression to ZIP_DEFLATED 15406 1726854976.83926: Set connection var ansible_timeout to 10 15406 1726854976.83929: Set connection var ansible_connection to ssh 15406 1726854976.83937: Set connection var ansible_shell_type to sh 15406 1726854976.83941: Set connection var ansible_shell_executable to /bin/sh 15406 1726854976.83948: Set connection var ansible_pipelining to False 15406 1726854976.83971: variable 'ansible_shell_executable' from source: unknown 15406 1726854976.83975: variable 'ansible_connection' from source: unknown 15406 1726854976.83977: variable 'ansible_module_compression' from source: unknown 15406 1726854976.83979: variable 'ansible_shell_type' from source: unknown 15406 1726854976.83982: variable 'ansible_shell_executable' from source: unknown 15406 1726854976.83984: variable 'ansible_host' from source: host vars for 'managed_node2' 15406 1726854976.83986: variable 'ansible_pipelining' from source: unknown 15406 1726854976.83990: variable 'ansible_timeout' from source: unknown 15406 1726854976.83993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15406 1726854976.84094: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854976.84105: variable 'omit' from source: magic vars 15406 1726854976.84110: starting attempt loop 15406 1726854976.84113: running the handler 15406 1726854976.84122: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15406 1726854976.84138: _low_level_execute_command(): starting 15406 1726854976.84145: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15406 1726854976.84654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854976.84657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.84661: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854976.84663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.84719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854976.84774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854976.84845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854976.86442: stdout chunk (state=3): >>>/root <<< 15406 1726854976.86570: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 15406 1726854976.86585: stderr chunk (state=3): >>><<< 15406 1726854976.86600: stdout chunk (state=3): >>><<< 15406 1726854976.86617: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854976.86629: _low_level_execute_command(): starting 15406 1726854976.86642: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854976.8661735-17404-274108845171535 `" && echo ansible-tmp-1726854976.8661735-17404-274108845171535="` echo /root/.ansible/tmp/ansible-tmp-1726854976.8661735-17404-274108845171535 `" ) && sleep 0' 15406 1726854976.87408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.87421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15406 1726854976.87424: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 15406 1726854976.87511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.87521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854976.87524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854976.87527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854976.87633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854976.89524: stdout chunk (state=3): >>>ansible-tmp-1726854976.8661735-17404-274108845171535=/root/.ansible/tmp/ansible-tmp-1726854976.8661735-17404-274108845171535 <<< 15406 1726854976.89634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854976.89656: stderr chunk (state=3): >>><<< 15406 1726854976.89681: stdout chunk (state=3): >>><<< 15406 1726854976.89700: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854976.8661735-17404-274108845171535=/root/.ansible/tmp/ansible-tmp-1726854976.8661735-17404-274108845171535 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854976.89723: variable 'ansible_module_compression' from source: unknown 15406 1726854976.89762: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15406j8df6f_j/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15406 1726854976.89800: variable 'ansible_facts' from source: unknown 15406 1726854976.89873: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854976.8661735-17404-274108845171535/AnsiballZ_command.py 15406 1726854976.90043: Sending initial data 15406 1726854976.90046: Sent initial data (156 bytes) 15406 1726854976.90520: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854976.90524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854976.90526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.90528: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854976.90530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 15406 1726854976.90532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.90580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<< 15406 1726854976.90586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854976.90655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854976.92203: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server 
supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15406 1726854976.92268: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15406 1726854976.92336: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpaj4rg2nl /root/.ansible/tmp/ansible-tmp-1726854976.8661735-17404-274108845171535/AnsiballZ_command.py <<< 15406 1726854976.92340: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854976.8661735-17404-274108845171535/AnsiballZ_command.py" <<< 15406 1726854976.92407: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15406j8df6f_j/tmpaj4rg2nl" to remote "/root/.ansible/tmp/ansible-tmp-1726854976.8661735-17404-274108845171535/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854976.8661735-17404-274108845171535/AnsiballZ_command.py" <<< 15406 1726854976.93112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854976.93184: stderr chunk (state=3): >>><<< 15406 1726854976.93190: stdout chunk (state=3): >>><<< 15406 1726854976.93214: done transferring module to remote 15406 1726854976.93223: _low_level_execute_command(): starting 15406 1726854976.93226: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854976.8661735-17404-274108845171535/ /root/.ansible/tmp/ansible-tmp-1726854976.8661735-17404-274108845171535/AnsiballZ_command.py && sleep 0' 15406 1726854976.93876: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15406 1726854976.93879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854976.93881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.93884: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15406 1726854976.93891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found <<< 15406 1726854976.93893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.93969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854976.94017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854976.95785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15406 1726854976.95812: stderr chunk (state=3): >>><<< 15406 1726854976.95816: stdout chunk (state=3): >>><<< 15406 1726854976.95828: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15406 1726854976.95831: _low_level_execute_command(): starting 15406 1726854976.95836: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854976.8661735-17404-274108845171535/AnsiballZ_command.py && sleep 0' 15406 1726854976.96259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15406 1726854976.96309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address <<< 15406 1726854976.96313: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15406 1726854976.96367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK <<< 15406 1726854976.96374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15406 1726854976.96458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15406 1726854977.30796: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3008 0 --:--:-- --:--:-- --:--:-- 3019\n % 
Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 4065 0 --:--:-- --:--:-- --:--:-- 4098", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:56:17.110967", "end": "2024-09-20 13:56:17.305409", "delta": "0:00:00.194442", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15406 1726854977.32432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 
<<< 15406 1726854977.32436: stdout chunk (state=3): >>><<< 15406 1726854977.32438: stderr chunk (state=3): >>><<< 15406 1726854977.32597: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3008 0 --:--:-- --:--:-- --:--:-- 3019\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 4065 0 --:--:-- --:--:-- --:--:-- 4098", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:56:17.110967", "end": "2024-09-20 13:56:17.305409", "delta": "0:00:00.194442", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.178 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.178 closed. 15406 1726854977.32607: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854976.8661735-17404-274108845171535/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15406 1726854977.32610: _low_level_execute_command(): starting 15406 1726854977.32613: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854976.8661735-17404-274108845171535/ > /dev/null 2>&1 && sleep 0' 15406 1726854977.33249: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15406 1726854977.33286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178 debug2: match not found <<< 15406 1726854977.33314: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15406 1726854977.33401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.178 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15406 1726854977.33430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0' <<<
15406 1726854977.33445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15406 1726854977.33466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15406 1726854977.33576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15406 1726854977.35479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15406 1726854977.35484: stdout chunk (state=3): >>><<<
15406 1726854977.35486: stderr chunk (state=3): >>><<<
15406 1726854977.35693: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.178 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.178 originally 10.31.45.178
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a4678d9fc0'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
15406 1726854977.35697: handler run complete
15406 1726854977.35699: Evaluated conditional (False): False
15406 1726854977.35701: attempt loop complete, returning result
15406 1726854977.35703: _execute() done
15406 1726854977.35705: dumping result to json
15406 1726854977.35707: done dumping result, returning
15406 1726854977.35709: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [0affcc66-ac2b-3c83-32d3-00000000050c]
15406 1726854977.35711: sending task result for task 0affcc66-ac2b-3c83-32d3-00000000050c
15406 1726854977.35781: done sending task result for task 0affcc66-ac2b-3c83-32d3-00000000050c
15406 1726854977.35784: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.194442",
    "end": "2024-09-20 13:56:17.305409",
    "rc": 0,
    "start": "2024-09-20 13:56:17.110967"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   305  100   305    0     0   3008      0 --:--:-- --:--:-- --:--:--  3019
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   291  100   291    0     0   4065      0 --:--:-- --:--:-- --:--:--  4098
15406 1726854977.35857: no more pending results, returning what we have
15406 1726854977.35860: results queue empty
15406 1726854977.35861:
checking for any_errors_fatal
15406 1726854977.35870: done checking for any_errors_fatal
15406 1726854977.35871: checking for max_fail_percentage
15406 1726854977.35873: done checking for max_fail_percentage
15406 1726854977.35874: checking to see if all hosts have failed and the running result is not ok
15406 1726854977.35875: done checking to see if all hosts have failed
15406 1726854977.35875: getting the remaining hosts for this loop
15406 1726854977.35877: done getting the remaining hosts for this loop
15406 1726854977.35886: getting the next task for host managed_node2
15406 1726854977.35901: done getting next task for host managed_node2
15406 1726854977.35904: ^ task is: TASK: meta (flush_handlers)
15406 1726854977.35907: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854977.35911: getting variables
15406 1726854977.35913: in VariableManager get_vars()
15406 1726854977.35942: Calling all_inventory to load vars for managed_node2
15406 1726854977.35944: Calling groups_inventory to load vars for managed_node2
15406 1726854977.35948: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854977.35959: Calling all_plugins_play to load vars for managed_node2
15406 1726854977.35962: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854977.35965: Calling groups_plugins_play to load vars for managed_node2
15406 1726854977.37833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854977.39413: done with get_vars()
15406 1726854977.39434: done getting variables
15406 1726854977.39503: in VariableManager get_vars()
15406 1726854977.39513: Calling all_inventory to load vars for managed_node2
15406 1726854977.39515: Calling groups_inventory to load vars for managed_node2
15406 1726854977.39517: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854977.39522: Calling all_plugins_play to load vars for managed_node2
15406 1726854977.39524: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854977.39526: Calling groups_plugins_play to load vars for managed_node2
15406 1726854977.40711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854977.42346: done with get_vars()
15406 1726854977.42371: done queuing things up, now waiting for results queue to drain
15406 1726854977.42373: results queue empty
15406 1726854977.42374: checking for any_errors_fatal
15406 1726854977.42377: done checking for any_errors_fatal
15406 1726854977.42378: checking for max_fail_percentage
15406 1726854977.42379: done checking for max_fail_percentage
15406 1726854977.42380: checking to see if all hosts have failed and the running result is not ok
15406 1726854977.42381: done checking to see if all hosts have failed
15406 1726854977.42381: getting the remaining hosts for this loop
15406 1726854977.42382: done getting the remaining hosts for this loop
15406 1726854977.42385: getting the next task for host managed_node2
15406 1726854977.42391: done getting next task for host managed_node2
15406 1726854977.42393: ^ task is: TASK: meta (flush_handlers)
15406 1726854977.42394: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854977.42401: getting variables
15406 1726854977.42402: in VariableManager get_vars()
15406 1726854977.42411: Calling all_inventory to load vars for managed_node2
15406 1726854977.42414: Calling groups_inventory to load vars for managed_node2
15406 1726854977.42416: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854977.42422: Calling all_plugins_play to load vars for managed_node2
15406 1726854977.42424: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854977.42427: Calling groups_plugins_play to load vars for managed_node2
15406 1726854977.43627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854977.45191: done with get_vars()
15406 1726854977.45224: done getting variables
15406 1726854977.45271: in VariableManager get_vars()
15406 1726854977.45281: Calling all_inventory to load vars for managed_node2
15406 1726854977.45283: Calling groups_inventory to load vars for managed_node2
15406 1726854977.45285: Calling all_plugins_inventory to load vars for managed_node2
15406 1726854977.45292: Calling all_plugins_play to load vars for managed_node2
15406 1726854977.45295: Calling groups_plugins_inventory to load vars for managed_node2
15406 1726854977.45297: Calling groups_plugins_play to load vars for managed_node2
15406 1726854977.46429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15406 1726854977.48111: done with get_vars()
15406 1726854977.48136: done queuing things up, now waiting for results queue to drain
15406 1726854977.48143: results queue empty
15406 1726854977.48144: checking for any_errors_fatal
15406 1726854977.48146: done checking for any_errors_fatal
15406 1726854977.48147: checking for max_fail_percentage
15406 1726854977.48148: done checking for max_fail_percentage
15406 1726854977.48148: checking to see if all hosts have failed and the running result is not ok
15406 1726854977.48149: done checking to see if all hosts have failed
15406 1726854977.48150: getting the remaining hosts for this loop
15406 1726854977.48151: done getting the remaining hosts for this loop
15406 1726854977.48154: getting the next task for host managed_node2
15406 1726854977.48158: done getting next task for host managed_node2
15406 1726854977.48158: ^ task is: None
15406 1726854977.48160: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15406 1726854977.48161: done queuing things up, now waiting for results queue to drain
15406 1726854977.48162: results queue empty
15406 1726854977.48162: checking for any_errors_fatal
15406 1726854977.48163: done checking for any_errors_fatal
15406 1726854977.48164: checking for max_fail_percentage
15406 1726854977.48165: done checking for max_fail_percentage
15406 1726854977.48166: checking to see if all hosts have failed and the running result is not ok
15406 1726854977.48166: done checking to see if all hosts have failed
15406 1726854977.48167: getting the next task for host managed_node2
15406 1726854977.48169: done getting next task for host managed_node2
15406 1726854977.48170: ^ task is: None
15406 1726854977.48171: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False

PLAY RECAP *********************************************************************
managed_node2              : ok=82   changed=3    unreachable=0    failed=0    skipped=71   rescued=0    ignored=2

Friday 20 September 2024  13:56:17 -0400 (0:00:00.662)       0:00:45.305 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.11s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.86s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.84s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.66s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.57s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.20s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
fedora.linux_system_roles.network : Re-test connectivity ---------------- 1.05s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gathering Facts --------------------------------------------------------- 1.04s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Gathering Facts --------------------------------------------------------- 1.03s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 1.02s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 0.99s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.98s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 0.97s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64
Gathering Facts --------------------------------------------------------- 0.96s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.95s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 0.95s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17
Gathering Facts --------------------------------------------------------- 0.92s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.84s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather the minimum subset of ansible_facts required by the network role test --- 0.80s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
15406 1726854977.48294: RUNNING CLEANUP
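For reference, the inline shell that the "Verify DNS and network connectivity" task ran (the "cmd" field in the task result above) can be pulled out into a standalone script. The sketch below reproduces the log's check loop; the `check_host` function wrapper and the `--run` guard are additions not present in the original task, added so the script only touches the network when explicitly asked.

```shell
#!/bin/sh
# Reconstructed from the task's "cmd" field: resolve each mirror host,
# then confirm it answers over HTTPS. check_host/--run are assumptions.
set -eu

check_host() {
    host="$1"
    # getent consults the same NSS sources (files, DNS, ...) the system uses,
    # so a failure here means the host genuinely does not resolve on this node.
    if ! getent hosts "$host"; then
        echo "FAILED to lookup host $host"
        return 1
    fi
    # Fetch and discard the response body; a non-zero exit means no contact.
    if ! curl -o /dev/null "https://$host"; then
        echo "FAILED to contact host $host"
        return 1
    fi
}

# Only reach out to the network when run with --run, so sourcing or
# syntax-checking the script has no side effects.
if [ "${1:-}" = "--run" ]; then
    echo "CHECK DNS AND CONNECTIVITY"
    for h in mirrors.fedoraproject.org mirrors.centos.org; do
        check_host "$h"
    done
else
    echo "usage: $0 --run"
fi
```

Note that the task's version runs under `set -euo pipefail` and exits 1 on the first failed host, which is why `rc=0` in the result above implies every lookup and every HTTPS probe succeeded.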